How can I deny web access to certain files?

I need to do something a bit unusual.

First, some context: I'm running Apache2 on Debian (it runs as the user www-data).

I have simple text files with a .txt or .ini extension (or whatever; the exact extension doesn't matter).

These files are located in subfolders with a structure like this:

www.example.com/folder1/car/foobar.txt
www.example.com/folder1/cycle/foobar.txt
www.example.com/folder1/fish/foobar.txt
www.example.com/folder1/fruit/foobar.txt

So the file name is always the same, and so is the hierarchy; only the name of the middle folder changes: /folder-name-static/folder-name-dynamic/file-name-static.txt

What I need to do is (I think) relatively simple: programs on the server (Python or PHP, for example) must be able to read those files, but if I try to retrieve a file's contents with a browser (typing the URL www.example.com/folder1/car/foobar.txt), via cURL, etc., I must get a forbidden error (or whatever), not the file's contents.

It would also be nice if those files were hidden, or at any rate not downloadable, when accessing the server via FTP (at least with anything other than the FTP root account and credentials I use).

How can I do this?

I found this online, to be put in the .htaccess file:

<Files File.txt>
 Order allow, deny
 Deny from all
</ Files>

It seems to work, but only if the file is in the web root (www.example.com/myfile.txt), not in subfolders. Moreover, the second-level folders (as in www.example.com/folder1/fruit/foobar.txt) will be created dynamically, and I would like to avoid having to change the .htaccess file every time.

Is it possible to create a rule, something like the one above, that applies to every file with a given name at www.example.com/folder-name-static/folder-name-dynamic/file-name-static.txt, where the static parts are always the same and only the dynamic folder name changes?

EDIT:

As Dave Drager said, I could simplify this by keeping those files outside the web-accessible directory. But those directories will also contain other files (images and other material used by my users), so I'm simply trying to avoid a duplicated folder structure like:

/var/www/vhosts/example.com/httpdocs/folder1/car/[other folders and files here]
/var/www/vhosts/example.com/httpdocs/folder1/cycle/[other folders and files here]
/var/www/vhosts/example.com/httpdocs/folder1/fish/[other folders and files here]

// and then, for the 'secret' files:

/folder1/data/car/foobar.txt
/folder1/data/cycle/foobar.txt
/folder1/data/fish/foobar.txt

You could use Files/FilesMatch and a regular expression:

<Files ~ "\.txt$">
    Order allow,deny
    Deny from all
</Files>

This is how .htpasswd is protected.
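Since directives in a parent directory's .htaccess also apply to all of its subdirectories, a variant pinned to the exact file name (assuming foobar.txt from the question) could be placed once in the web root and would then cover the dynamically created folders as well:

<FilesMatch "^foobar\.txt$">
    Order allow,deny
    Deny from all
</FilesMatch>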

Alternatively, you can redirect any access to .txt files to a 404:

RedirectMatch 404 \.txt$
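If you only want to hide that one file rather than every .txt, the regular expression can be anchored to the path instead (a sketch, assuming the folder1 and foobar.txt names from the question):

RedirectMatch 404 /folder1/[^/]+/foobar\.txt$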

From the documentation for the Order directive:

Keywords may only be separated by a comma; no whitespace is allowed between them.

So the following is incorrect:

<Files File.txt>
Order allow, deny
Deny from all
</ Files>

The following is (more) correct:

<Files "File.txt">
Order allow,deny
Deny from all
</Files>
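
Note that Order/Allow/Deny are the old Apache 2.2 access-control directives. If you are on Apache 2.4, where they only survive through mod_access_compat, the modern equivalent would be:

<Files "File.txt">
Require all denied
</Files>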

Yes, this is all possible. However, if you have programs on the server that need to access these text files, the files should reside outside the web root. For example, if your web files are in ~/public_html/, store the data files in ~/data/. There is no reason to place them in the public HTML folder.

In addition to moving them out of the web folder, make sure the Unix permissions on them are set correctly. The user the scripts run as needs read (and possibly write) access, but nobody else needs any access at all.

If you DO need to have these files in a web-accessible folder, there are ways to do what you are asking with mod_rewrite. See the following site for many examples, one of which should fit the bill.

http://perishablepress.com/press/2006/01/10/stupid-htaccess-tricks/
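
For instance, a minimal mod_rewrite sketch, assuming the rules live in the web root's .htaccess and that folder1 and foobar.txt are the real names:

RewriteEngine On
RewriteRule ^folder1/[^/]+/foobar\.txt$ - [F,L]

The [F] flag makes Apache return 403 Forbidden for those URLs, while scripts on the server can still read the files directly from the filesystem.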