I am using cflocation to redirect all users to a temporary page. What I am wondering is if there is a way to have cflocation randomly or, even better, non-sequentially choose from a list of temporary pages to redirect users to. Is this possible?
This is actually a really easy task. ColdFusion provides very accessible randomization methods that we can leverage to get this done. What I'm gonna do is just put the list of temporary pages into an array and let ColdFusion randomly select from that array:
<!--- Create an array of URLs to which we are redirecting. --->
<cfset arrPages = ArrayNew( 1 ) />

<!--- Add the temporary pages to the array. --->
<cfset ArrayAppend( arrPages, "./temp/one.cfm" ) />
<cfset ArrayAppend( arrPages, "./temp/two.cfm" ) />
<cfset ArrayAppend( arrPages, "./temp/three.cfm" ) />
<cfset ArrayAppend( arrPages, "./temp/four.cfm" ) />

<!---
	Because we are using this as a "Temporary Page", we can use a
	standard CFLocation tag. If we wanted this to be more permanent,
	I would recommend using the CFHeader tag to provide the proper
	status code (or CFLocation if you are using CF8). Let ColdFusion
	select a random page using RandRange() and the size of the
	pages array.
--->
<cflocation
	url="#arrPages[ RandRange( 1, ArrayLen( arrPages ) ) ]#"
	addtoken="false"
	/>
Because the URL selection uses the actual array of pages to determine the random number range, you can update the array and the random redirects will update automatically. RandRange() takes a lower and an upper bound and returns a random integer from that range, with both ends inclusive.
I hope that helps.
Why would you want to redirect users to a random page??
I only answer the questions - I don't know why they're asked.
@Christopher and Ben
It's not a totally unusual request in community sites. When you have thousands or millions of members, it's quite a cool feature to be redirected to a random user page or submission, so you learn about users you wouldn't normally encounter.
deviantART for instance has Random Deviation (submission) and Random Deviant (user).
Of course Ben's example wouldn't scale for them since they have 2 million accounts and millions of submissions. Instead you'd need to generate a random number (or several) and select from the database the record with that id. By generating several record ids randomly you can avoid selecting a deleted row because the probability goes down dramatically.
<cfset user = queryNew( "username" ) />
<cfloop condition="#not user.recordCount#">
	<!--- RandomArray() is assumed to be a UDF that returns 10 random record ids; "dsn" is your datasource. --->
	<cfset numbers = RandomArray( 10 ) />
	<cfquery name="user" datasource="#dsn#">
		select top 1 username
		from users
		where id in ( <cfqueryparam value="#arrayToList( numbers )#" cfsqltype="cf_sql_integer" list="true" /> )
	</cfquery>
</cfloop>
We can then cflocation like in Ben's code:
<cflocation url="http://#user.username#.riaforge.org" addtoken="false">
There is a possibility of an infinite loop in there, but the probability of never hitting a single record that's not deleted is so low I doubt we'll ever come across it. If you were worried, you could also add a check that bails out after the loop has run N times.
If hitting the database turns into a bottleneck, then cache a large number of random records (ex. 200) and select from them randomly without ever selecting duplicates. That still appears "random" to users, since most will probably never hit it 200 times in a day; and with a little cleverness you can avoid even that edge case.
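A rough sketch of that caching idea, using an application-scoped array (the application.randomUsers variable, the users table, and the dsn datasource are all assumptions here):

```cfml
<!--- Populate the cache once, e.g. on application start (SQL Server syntax assumed). --->
<cfif not structKeyExists( application, "randomUsers" ) or not arrayLen( application.randomUsers )>
	<cfquery name="cached" datasource="#dsn#">
		select top 200 username
		from users
		order by newid()
	</cfquery>
	<cfset application.randomUsers = listToArray( valueList( cached.username ) ) />
</cfif>

<!--- Pop a random entry out of the cache so it can't be served twice. --->
<cfset index = randRange( 1, arrayLen( application.randomUsers ) ) />
<cfset username = application.randomUsers[ index ] />
<cfset arrayDeleteAt( application.randomUsers, index ) />

<cflocation url="http://#username#.riaforge.org" addtoken="false" />
```

When the cached array runs dry, the cfif at the top simply repopulates it with a fresh random batch, so the database is only hit once per 200 redirects.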
Or hit the DB up for a random row and don't worry about getting back a deleted row.
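On SQL Server, that single-query version can be sketched like this, using ORDER BY NEWID() to shuffle the rows (the users table, column name, and dsn datasource are assumptions):

```cfml
<!--- NEWID() generates a new GUID per row, so ordering by it returns a random row. --->
<cfquery name="user" datasource="#dsn#">
	select top 1 username
	from users
	order by newid()
</cfquery>

<cflocation url="http://#user.username#.riaforge.org" addtoken="false" />
```

Since deleted rows are simply absent from the table, there's no retry loop to worry about; the trade-off is the full-table scan discussed below.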
Good use case, thanks.
I second Ben's "thanks". I've spent so much of my time working on boring (and lucrative) business/gov. apps that I couldn't think of a case where I'd want to apply randomness...other than the obvious "delete a random user" button that I just added to a system that I'm working on now ;)
>delete a random user
Ha ha ha :) I try to make that an implicit part of my systems - that way it takes the thinking out of it for my users.
If you read through the comments on that page you can see the various suggested methods don't scale well at all because they require generating random numbers for every row in the table.
If you have 62 million rows, that's no good.
Generating a finite key set and checking against it is O(k), where k is the key set size; since k is fixed, that's constant. The index scan in the database is probably O(k log(n)). Since we never generate n keys (that'd be silly, since it'd end up the same as the RAND() solution), we get O(k log(n))-type performance, which scales very well on large, partitioned, clustered-index databases. Worst case we might have to do n key set generations, but honestly, that case is so rare...
On the other hand, the RAND() solution is always O(n), since it needs to generate a random number for every row in the database. Then we end up doing an O(n log(n)) sort of the records in memory, so we end up with O(n log(n)) as the runtime. Worse, though, is the fact that both the sort and the RAND() totally defeat the clustering and the indexing, since we scan the entire table. :(
In any case, I suppose the right answer is "it depends". If your application is sufficiently modular you could refactor it later to handle the larger database by simply rewriting that one method.
Weird. I've used the SQL "NEWID()" on a table with a couple of million rows and did not have an issue with the data return. It seemed to chug along nicely. I guess if it was being hit with regularity (being on the front page for example), then it may bog the server down, but I hadn't had the issue when I ran it. I'll take your word for it as your knowledge seems to be O(K) on the subject :)
Haha, fair enough. I've not benchmarked it, so outside the theory I'm not quite sure what the exact metrics are.
That'd certainly be something fun to look at.