This is just a minor note but a really powerful piece of information. Eric Stevens of the B And E Blog did some memory usage testing involving ColdFusion 8 and the CFContent tag and the findings are very cool:
I just did a test in CF 8 using FusionReactor to watch memory while using <cfcontent type="application/x-zip-compressed" file="#ExpandPath('bigfile.zip')#">
Memory was at 141mb when the file started, memory was at 151 when the file transfer completed. The file was 302 meg, and the transfer took about 2 minutes. The memory graph never spiked, and never went over 151 meg, though it did clearly do a garbage collection in the middle of the transfer and drop down to 141 meg before rising gradually to 151 again.
Long story short: it looks like <cfcontent file="..."> uses buffering to transfer the file, it doesn't consume an inappropriate level of resources.
Not to reiterate what he just said, but it looks like our assumptions were wrong - ColdFusion's CFContent tag is being smart about the way it reads in and streams file data to the client browser. Clearly, memory usage should not be a concern when it comes to using CFContent to stream secure files out of a document library. And, as I have been told before, tying up the thread to stream the file is not much of a concern either; I have no real evidence about the thread usage one way or the other, so I just have to go with what I have been told.
Based on the evidence and hearsay, ColdFusion's CFContent tag is nothing but totally awesome and should be used whenever necessary. Or is that too big of a leap of faith?
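To ground the discussion, here is a minimal sketch of the kind of secure-download template being talked about. The directory path, session check, and file name are all assumptions for illustration, not code from Eric's test:

```cfml
<!--- Minimal sketch of a secure file-download template; the paths
	and validation logic here are hypothetical. --->

<!--- Serve files from outside the web root so they cannot be
	linked to directly. --->
<cfset secureDirectory = ExpandPath( "../secure_files/" ) />
<cfset fileName = GetFileFromPath( url.file ) />
<cfset filePath = ( secureDirectory & fileName ) />

<!--- Make sure the user is allowed to see this file and that
	the file actually exists. --->
<cfif NOT (session.isLoggedIn AND FileExists( filePath ))>
	<cfabort />
</cfif>

<!--- Suggest a file name to the browser. --->
<cfheader
	name="Content-Disposition"
	value="attachment; filename=""#fileName#"""
	/>

<!--- Stream the file. Per the test above, ColdFusion buffers this
	rather than reading the whole file into memory. --->
<cfcontent
	type="application/octet-stream"
	file="#filePath#"
	/>
```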
This is good to know, thanks.
Then one last vice of CFCONTENT may be the (confirmed?) issue of throttled transfer speed?
Back in the day there was a lot of discussion about slower download speeds when using CFCONTENT.
I'm not sure I ever saw this disproved or fixed, but it always made me feel a little dirty when using CFCONTENT to secure file access. Perhaps someone can shed some light on this.
It might be slightly unrelated to the original test, but what happens under load?
2 concurrent, 5 concurrent, 10?
The stand alone test proves a point, but in the real world, does that hold up?
That's a good point Brian.
Just now I did 5 simultaneous requests, and the only impact I noticed was increased CPU usage (ProcessExplorer had it almost all in disk IO, probably from me simultaneously reading and writing hundreds of megs to my disk under a VM environment). The only impact on memory seemed to be that the climb and garbage collection cycle was a little faster (not even 5x as frequent, maybe only 2x as frequent).
So, if there were any performance issues, it seems that they would be the result of the file system, not necessarily of CFContent. Meaning, you would see load issues in any server-side scripting language, because the file reading and the bandwidth become the limiting factors. Does that seem correct?
Yes. And further, I believe that I only saw this much CPU usage because I was working my disk much harder than the server would (I was reading off the same disk drive as I was writing files to, all from within a virtual machine - which has reduced disk performance to begin with).
CF's CPU time was around 15-20%, the rest was consumed by 5 copies of WGet, all of these processes showed high IO time in Process Explorer. These are similar characteristics to what my CPU looks like when I bypass CF and download the files directly from the server.
Rock on. Do you realize how cool your findings are??
When we use cfcontent to create a secure document library on an SSL-enabled site, IE browsers can't download the file. We've tried a number of cfheader and cfcontent options without luck. Firefox, Opera, etc. work great, but IE fails, saying it can't find the file.
Of course, I don't find the answer until I post a comment. For any other poor sap who has this problem, set these values:
<cfheader name="Pragma" value="">
<cfheader name="Cache-control" value="">
Thanks for posting the solution! I hope this helps others.
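For context, the fix above is typically applied right before the download is streamed. IE refuses to save a file over SSL when the response says it cannot be cached, so blanking out the no-cache headers works around it. A sketch, where the file name and path are placeholders:

```cfml
<!--- Sketch of the IE-over-SSL workaround described above; the file
	name and path are placeholders. Blank out the no-cache headers so
	IE will agree to save the file. --->
<cfheader name="Pragma" value="" />
<cfheader name="Cache-control" value="" />

<cfheader
	name="Content-Disposition"
	value="attachment; filename=""report.pdf"""
	/>

<cfcontent
	type="application/pdf"
	file="#ExpandPath( './report.pdf' )#"
	/>
```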
Using CFCONTENT should be strongly discouraged on high-traffic sites; it is EXTREMELY resource hungry and unreliable. One of our clients has a very busy site, about 60GB per day of file downloads. The site all but crashed and burned using cfcontent; not to mention, files of even 50+ MB constantly timed out while downloading.
I am embarrassed to admit we had to rewrite the entire secure download sequence using PHP (readfile()) and never had any further problems.
I am not trying to poo-poo this article; I just wanted to point out that it does not speak to real-world, high-volume scenarios. Adobe has a lot of work to do in this area to catch up to PHP and even .NET (Response.TransmitFile); in general, most file management capabilities in CF are not usable on high-volume sites.
We use cfcontent to serve product images (from thumbnails through full-sized images) for a high volume ecommerce site with around 4,000 products on it. I'm limited in how many specifics I can go into, but let's just say it's big in terms of volume, revenue, and physical topography. Our images are basically served out of an Oracle database with cfcontent so that we don't have to worry about replicating so many product images (that are updated frequently by our business users) as files to our whole cluster.
We've never had a problem with it. The tests Ben quoted me on were part of the research into whether this was feasible or would be a resource / performance hog.
Doing it well is less trivial than it seems. It's easy to take for granted how much work the authors of web servers put into things like good caching policies and various optimizations. It took us a while to really tune in our caching policy, including pregenerating ETags (avoiding the dynamic ETags that Apache and IIS generate differently on each physical server in a cluster is one of the big benefits we gained from cfcontent).
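As a rough illustration of the kind of caching policy being described, a conditional-GET check against a pregenerated ETag might look something like this. The query name, column names, and image type are all assumptions for illustration:

```cfml
<!--- Rough sketch of serving a database-stored image with a
	pregenerated ETag and honoring conditional GETs; qImage and its
	columns are hypothetical names. --->
<cfset headers = GetHttpRequestData().headers />

<!--- Compare the browser's cached ETag against the one stored
	alongside the image when it was last updated. --->
<cfif StructKeyExists( headers, "If-None-Match" )
	AND (headers[ "If-None-Match" ] EQ qImage.etag)>

	<!--- Not modified - let the browser use its cached copy. --->
	<cfheader statuscode="304" statustext="Not Modified" />
	<cfabort />

</cfif>

<!--- Send the ETag so the browser can revalidate next time. --->
<cfheader name="ETag" value="#qImage.etag#" />
<cfcontent
	type="image/jpeg"
	variable="#qImage.imageData#"
	/>
```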
All that said, CF is not as lightweight or as fast as PHP. There's a considerably higher per-page-request overhead for CF, so if you can get acceptable performance from PHP but not CF, and don't have to give up any security or functionality, then of course go with what works. I'm a PHP man myself, actually; I've just done hybrid-language sites before, and I believe a homogeneous technology stack is better when feasible.
It's nice to get the different experiences on high volume sites. I have never personally had to deal with either such large files or such high volume.
Question: I'm pretty new to ColdFusion, but am more experienced running J2EE-type boxes. I'm trying to set up a ColdFusion dev server on a VM with limited memory, and I'm trying to trim it back so that it doesn't pork out and use every bit of memory on any computer within a 10-mile radius. I.e., trying to get it so that it perhaps doesn't fire up all of the Flash Remoting servlets or other things I don't need for basic CF functionality, in hopes that this might trim back the RAM usage. Any help you might be able to offer on this?
This is a bit out of my area of expertise. Perhaps someone else here can lend a suggestion?
This might not be the best place for this comment, since this is about my recent battle with CFContent on CF MX 7 (not 8).
I support an application which lets users download PDFs based on the links they select. The PDFs are stored as BLOBs in an Oracle database.
Originally, this application was built on CF 5, and I am in the process of migrating it to CF MX 7.
After transferring all my code to CF 7, and making the other obvious changes (like Maxrows no longer being supported on CFProcParam), I got to testing whether the PDFs were being queried and presented to the browser fine.
The first PDF came up fine, and I was happy! The second resulted in a "Server Error - JRun starting up or too many concurrent requests".
On hitting the back button, I clicked on the link one more time, and this time the PDF came up fine.
In my mind, I thought the basic difference between CF 5 and CF 7 was JRun, and I was convinced it was some sort of JRun issue. Because of the nature of the error I was getting, I spent around one week trying to tune activeHandlerThreads, maxHandlerThreads, etc. No luck!
Then I came across online literature saying how CFContent can kill your server if it's trying to flush a huge file to the browser. My PDFs are not very huge (a little more than 100 KB), but I had no option but to shift my investigation towards the CFContent in my code.
I decided to substitute CFContent with ..
<CFHeader name ="Content-Type" value ="application/pdf">
Note the use of the ToString() function; binary data cannot be output inside <cfoutput>..</cfoutput> directly.
Also note that there are absolutely no spaces or line breaks between the <cfheader>, the <cfoutput>, and the ToString().
These precautions are not required when using <cfcontent>, and I learned this after a lot of trial and error.
Also, when substituting CFHeader for CFContent, one must make sure no other output from any other template is being sent to the browser, not even the debugging output. I got corrupted PDFs for a long time until I figured out these wonderful facts of life about CFHeader.
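Piecing the description above together, the substitution would look something like this sketch. The query and column names (qPDF, pdfBlob) are hypothetical, since the original snippet was truncated:

```cfml
<!--- Sketch of the CFHeader / ToString() substitution described
	above; qPDF and pdfBlob are hypothetical names. Per the comment,
	no whitespace may separate the cfheader from the cfoutput, and
	nothing else may be output to the browser (debugging output
	included), or the PDF gets corrupted. --->
<cfsetting enablecfoutputonly="true" showdebugoutput="false" />
<cfheader name="Content-Type" value="application/pdf"><cfoutput>#ToString( qPDF.pdfBlob )#</cfoutput><cfabort />
```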
And then.. in the magical moment.. when the angel of wisdom whispered all the right syntax in my ear, I got the perfect PDF. No server error anymore. In disbelief, I tried all I could to break the server and see the server error once again. But it was gone for good.
I just want any other poor soul suffering from JRun busy errors to see if my experience can help him save a lot of time and headache.
Thanks everyone, for reading patiently.
Very interesting. It would not occur to me that you could simply convert a binary value to a string and output it to the rendered content. That's pretty cool.
I wonder why this doesn't create too many concurrent requests while CFContent does; after all, I think it's the same number of requests. Perhaps this is a CF7-specific behavior?
I followed up on this a little further to see if I could do the on-the-fly PDF magic in CF 5.
Turns out, I cannot. And for the same reason: CF 5 doesn't let me use ToString() on a binary variable :(.
Hi Ben, I'm very interested in using CFContent on CF8 to send 800MB ISOs from a secured location. I've been using a file-chunking ASP.NET script and it does the trick; it lets me update the database with how many bytes were transferred at the end of the session, or if the session times out, so I can log whether the user successfully completed the transfer.
I'm a big fan of ColdFusion, as the site is written in CF; I only used ASP.NET because CFContent wasn't chunking. My question is: is there a way to allow for a resume of a large ISO using this method?
Many users have the transfer abort, and since we pay per GB transferred, the failed transfers cost us.
The .NET application I'm using had a section talking about HTTP handlers and auto-resuming chunked file fetches, but not being a real programmer, it was totally over my head, requiring the compilation of DLLs.
I don't have any good answers for you, I'm sorry. But it is a problem that is terribly interesting (and not uncommon). I'll see if I can do some research.
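At a glance, resumable downloads revolve around the HTTP Range header. A very rough, conceptual sketch of the moving parts (the file path is a placeholder, and I have not battle-tested any of this):

```cfml
<!--- Very rough sketch of honoring an HTTP Range request so a
	client can resume an aborted download; the file path is a
	placeholder and this is conceptual, not battle-tested. --->
<cfset filePath = ExpandPath( "./big.iso" ) />
<cfset fileSize = GetFileInfo( filePath ).size />
<cfset headers = GetHttpRequestData().headers />

<!--- Advertise that we accept byte-range requests. --->
<cfheader name="Accept-Ranges" value="bytes" />

<cfif StructKeyExists( headers, "Range" )>

	<!--- A resume request looks like "bytes=1048576-". --->
	<cfset startByte = Val( ListLast( headers[ "Range" ], "=" ) ) />

	<cfheader statuscode="206" statustext="Partial Content" />
	<cfheader
		name="Content-Range"
		value="bytes #startByte#-#(fileSize - 1)#/#fileSize#"
		/>

	<!--- CFContent has no way to stream only part of a file, so
		this last step would need to drop down to Java streams (or
		another language); shown conceptually only. --->

</cfif>
```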
I bumped into a problem with CFContent and found your article. CF failed to serve 1GB files using CFContent and left me clueless.
In case anyone has comments on this, I noted my humble experience here: http://rodionbykov.com/default/index.cfm/web/coldfusion/failed-download-of-large-files-with-cfcontent/
PING! Hey anyone find a solution to this? The bug is still open with Adobe, and it's just killing us here. I might just have to hack it out in Java, but just not sure how to get the pointer to the output stream.
If it's causing you problems, vote for it!
I know this is a little late to the party, but is it possible you're running into the Request Throttle Memory limit?
CF Admin --> Server Settings --> Settings. Down at the bottom you have the minimum request size that triggers throttling and the maximum size of request served before rejection.
The bug with the serving of large files has been resolved by Adobe (yay!) in its latest hotfix: http://kb2.adobe.com/cps/918/cpsid_91836.html
Reference bug ID #83425.
Anyone successfully install the CF 9.0.1 Cumulative Hotfix 2? After installing, I couldn't get the CF Admin page working.
I have multiple instances of CF Enterprise 9.0.1 with clustering, and I am having a cfcontent issue when downloading a file over 300MB. Otherwise, it is working fine.