As jQuery for Designers continued to grow in interest, so did the video downloads.

The main problem my server faced was a large influx of traffic arriving all at once. Say Smashing Magazine published a post covering a couple of tutorials: all the Apache processes would be busy serving up 30MB video files, traffic would start to build, the server would start to choke, and it all went downhill from there.

S3 to the rescue, but I also wanted to make sure I could keep track of the download stats.


Step 1 - Upload to S3

On the Mac, I've traditionally used Cyberduck for transferring files. In the last 6 months or so, it added support for uploading directly to S3 'buckets'. I've tried the Firefox plugin (it was a bit clunky), and before discovering Cyberduck's S3 support I had used JungleDisk.

The advantage of Cyberduck is that a) it's free, and b) it's super easy to use - it treats S3 just like a traditional connection.

Once the files are uploaded, you need to make sure read permission is set on everything you want your users to be able to download.

Step 2 - Tidy URLs

So first off - this URL sucks:

It's too long, and ideally the URL should match the domain your site is running on.

Digital Magazine has a superb article on how to configure S3 to give you really clean URLs.

One tip I can offer when you're setting this up: you need to ensure your 'bucket' name matches the domain you redirect to.

In my case, it was going to be, so the bucket name (actually just a new directory if you're using Cyberduck) had to match exactly.
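For illustration (media.example.com is a placeholder here; substitute your own domain), the bucket name and the DNS record both carry the exact same host name:

```
; hypothetical DNS zone fragment: the bucket itself is named media.example.com
media.example.com.  IN  CNAME  media.example.com.s3.amazonaws.com.
```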

I could now access all the files I had uploaded to the bucket and marked as readable through this clean URL:

However, there's just one more simple step to keeping track of all the downloads.

Step 3 - Track

To completely track the downloads, and make use of nice clean URLs, we need two things:

  1. .htaccess
  2. A logging script (taken almost entirely from Linklove's analytics without JavaScript)

As mentioned, in my case the target URL is:

This is achieved by creating a logging script called media.php, which expects a query string in the form url=video.fmt. The actual URL the user will see is:

This way I can process the request via the media.php script, then redirect the user on to the real location of the file.


.htaccess

mod_rewrite is used to keep the URL clean and redirect the request to the right script:

<IfModule mod_rewrite.c>
Options +FollowSymLinks +ExecCGI
RewriteEngine On
RewriteBase /

RewriteRule ^media/(.*)$ /media.php?url=$1 [L]

# other rules here...
</IfModule>
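To sanity-check what the rule does, the same pattern can be exercised in PHP - a sketch only, since the live rewriting is done by Apache, not PHP:

```php
<?php
// In a per-directory .htaccess the leading slash is stripped before the
// pattern is applied, so the rule sees "media/video.mp4".
$requestPath = 'media/video.mp4'; // hypothetical incoming request

// Same pattern as the RewriteRule: media/(anything) -> media.php?url=(anything)
$internalUrl = preg_replace('#^media/(.*)$#', 'media.php?url=$1', $requestPath);

echo $internalUrl . "\n"; // media.php?url=video.mp4
```

Requests that don't start with media/ fall through the rule untouched, which is why the other rules can safely follow it.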

Logging Script

<?php
$var_utmac = 'UA-1234567-8'; // your urchin code
$var_utmhn = ''; // your domain
$var_utmp = 'media/' . $_GET['url']; // the file name that will be logged in Google Analytics

// Shouldn't need to change these...
$var_utmn = rand(1000000000, 9999999999); // random request number
$var_cookie = rand(10000000, 99999999); // random cookie number
$var_random = rand(1000000000, 2147483647); // number under 2147483647
$var_today = time(); // today
$var_referer = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '-'; // referring url
$var_uservar = '-'; // enter your own user defined variable

// the Analytics URL - the __utm.gif request, as in Linklove's script
$urchinUrl = 'http://www.google-analytics.com/__utm.gif?utmwv=1&utmn=' . $var_utmn .
    '&utmsr=-&utmsl=-&utmsc=-&utmul=-&utmje=0&utmfl=-&utmdt=-&utmhn=' . $var_utmhn .
    '&utmr=' . urlencode($var_referer) . '&utmp=' . urlencode($var_utmp) . '&utmac=' . $var_utmac .
    '&utmcc=__utma%3D' . $var_cookie . '.' . $var_random . '.' . $var_today . '.' . $var_today . '.' . $var_today . '.2%3B%2B' .
    '__utmb%3D' . $var_cookie . '%3B%2B__utmc%3D' . $var_cookie . '%3B%2B' .
    '__utmz%3D' . $var_cookie . '.' . $var_today . '.2.2.utmccn%3D(direct)%7Cutmcsr%3D(direct)%7Cutmcmd%3D(none)%3B%2B' .
    '__utmv%3D' . $var_cookie . '.' . $var_uservar . '%3B';

// the request to Google is sent by simply opening the URL
$handle = fopen($urchinUrl, 'r');
$test = fgets($handle);
fclose($handle);

// finally we redirect the user to the real location of the file
header('Location: ' . $_GET['url']);
exit;
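One caveat worth adding on top of the script above (my hardening suggestion, not part of the original): $_GET['url'] flows straight into a Location header, so it pays to whitelist the file name first. A minimal sketch, assuming downloads are plain file names such as video.mp4:

```php
<?php
// Hypothetical helper: accept only simple names like "video.mp4" and reject
// slashes, ".." and anything else that could turn the redirect into an open
// redirect or allow header injection. Deliberately strict: one dot only.
function safe_media_name($url) {
    if (preg_match('#^[A-Za-z0-9_-]+\.[A-Za-z0-9]+$#', $url)) {
        return $url;
    }
    return false;
}

// Example checks
var_dump(safe_media_name('video.mp4'));      // string(9) "video.mp4"
var_dump(safe_media_name('../etc/passwd'));  // bool(false)
```

With that in place, media.php can 404 early on anything that fails the check, before the hit is logged or the redirect is issued.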