I compared the global latency of some popular commercial CDNs using just-ping.
You can download the raw test results from here.
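just-ping pings a hostname from many locations around the world and reports round-trip times in milliseconds; the figures below are aggregated across those locations. For a rough single-location spot check you can do something similar with plain ping. This is only a sketch, not how the numbers below were gathered, and the hostnames are placeholders:

# Print the min/avg/max RTT summary line for each host (placeholder hostnames)
for host in cdn1.example.com cdn2.example.net; do
    echo "$host"; ping -c 5 -q "$host" | tail -1
done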
Result Summary:
| Network             | Average (ms) | Median (ms) |
|---------------------|--------------|-------------|
| Akamai              | 9.67         | 2.9         |
| Aol CDN             | 9.22         | 4           |
| Panthercdn          | 62.36        | 12.7        |
| LimeLight           | 58.04        | 13.2        |
| Mosso Cloud Files   | 56.81        | 13.5        |
| Amazon Cloudfront   | 62.82        | 18.6        |
| Google Homepage     | 53.53        | 23.15       |
| Cachefly            | 54.57        | 28.2        |
| Google Ajax Library | 54.96        | 28.5        |
| Homemade CDN        | 76.31        | 29          |
| Yahoo Homepage      | 82.77        | 38.4        |
| Google App Engine   | 76.03        | 42.8        |
| US East             | 130.11       | 96.9        |
| SimpleCDN           | 142.84       | 100.8       |
| US West             | 156.32       | 165.4       |
Notes:
- Akamai
- Aol CDN is served by Akamai
- Panther Express
- Limelight Networks
- Mosso is served by Limelight
- Amazon CloudFront: images on this page are served by CloudFront
- Google
- CacheFly
- Google AJAX Libraries
- Homemade CDN: this page is served by my own homemade CDN; you can test its speed here
- Yahoo
- Google App Engine
- US East: a single-location IP in New Jersey, USA, listed for comparison purposes
- SimpleCDN
- US West: a single-location IP in California, USA, listed for comparison purposes
Disclaimer: I am not affiliated with any of the companies mentioned above, nor do I endorse the accuracy of these results.
If you use my WordPress CDN plugin with Amazon CloudFront, you may have trouble putting files into S3 storage. Here is a simple way to do it on Linux without any commercial tool.
First, download s3sync and extract it somewhere; in this example I used my home directory.
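If you don't have s3sync yet, something like the following works; the download URL is the one the s3sync documentation listed at the time, so verify it before relying on it:

cd ~
# Fetch and unpack s3sync into the home directory (URL assumed from the
# s3sync docs of the era; check that it still resolves)
wget http://s3.amazonaws.com/ServEdge_pub/s3sync/s3sync.tar.gz
tar xzf s3sync.tar.gz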
Then edit ~/.s3conf/s3config.yml, which should look like this:
aws_access_key_id: your s3accesskey
aws_secret_access_key: your secret key
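Since this file holds your secret key, it's worth making it readable only by you; this is just my suggestion, not an s3sync requirement:

# Keep the AWS keys private
chmod 600 ~/.s3conf/s3config.yml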
Enter the WordPress directory and run:
cd wordpress
find * -type f -readable \( -name \*.css -o -name \*.js -o \
-name \*.png -o -name \*.jpg -o -name \*.gif -o -name \*.jpeg \) \
-exec ~/s3sync/s3cmd.rb -v put bucket:prefix/{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 \;
Change bucket to your real bucket name. If you don't need a prefix, leave out the slash as well. Adjust the Cache-Control header to whatever suits you. ~/s3sync/s3cmd.rb should point to wherever you extracted s3sync.
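If you want to see which files the command will touch before actually uploading anything, swapping the -exec action for -print is a harmless dry run:

# Dry run: list the matching files without uploading anything
find * -type f -readable \( -name \*.css -o -name \*.js -o \
-name \*.png -o -name \*.jpg -o -name \*.gif -o -name \*.jpeg \) -print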
Update 1: Don't forget to install mime-types if your Linux distro didn't install it by default. Check whether /etc/mime.types exists.
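A quick one-liner to check and, if needed, install it; the package name here is the Debian/Ubuntu one and may differ on other distros:

# Install the MIME table if it is missing (Debian/Ubuntu package name assumed)
[ -f /etc/mime.types ] || sudo apt-get install mime-support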
Update 2: s3cmd.rb does not set Content-Type at all (I think the Python version does), so I wrote a script to redo everything with explicit content types.
#!/bin/sh
# Set your bucket
BUCKET=
# If you want to use a prefix, set it like PREFIX=blog/
PREFIX=
find * -type f -readable -name \*.css -exec ~/s3sync/s3cmd.rb -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:text/css \;
find * -type f -readable -name \*.js -exec ~/s3sync/s3cmd.rb -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:application/x-javascript \;
find * -type f -readable -name \*.png -exec ~/s3sync/s3cmd.rb -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/png \;
find * -type f -readable -name \*.gif -exec ~/s3sync/s3cmd.rb -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/gif \;
find * -type f -readable \( -name \*.jpg -o -name \*.jpeg \) -exec ~/s3sync/s3cmd.rb -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/jpeg \;
Update 3: I just realized CloudFront does not gzip files on its own, so I rewrote my script to force gzip encoding on CSS and JS files.
#!/bin/sh
# Set your bucket
BUCKET=
# If you want to use a prefix, set it like PREFIX=blog/
PREFIX=
# Absolute path to s3cmd.rb (where you extracted s3sync)
S3CMD=/home/user/s3sync/s3cmd.rb
find * -type f -readable -name \*.css -exec sh -c "gzip -9 -c {} > /tmp/s3tmp && \
$S3CMD -v put $BUCKET:$PREFIX{} /tmp/s3tmp x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:text/css Content-Encoding:gzip" \;
find * -type f -readable -name \*.js -exec sh -c "gzip -9 -c {} > /tmp/s3tmp && \
$S3CMD -v put $BUCKET:$PREFIX{} /tmp/s3tmp x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:application/x-javascript Content-Encoding:gzip" \;
find * -type f -readable -name \*.png -exec $S3CMD -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/png \;
find * -type f -readable -name \*.gif -exec $S3CMD -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/gif \;
find * -type f -readable \( -name \*.jpg -o -name \*.jpeg \) -exec $S3CMD -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/jpeg \;
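To confirm the headers stuck, you can inspect an uploaded file directly from S3; the bucket and path below are placeholders:

# Check Content-Type, Content-Encoding and Cache-Control on an uploaded file
curl -sI http://yourbucket.s3.amazonaws.com/blog/style.css | \
grep -i -e content-type -e content-encoding -e cache-control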
Update 4 (4/1/2009): I added the ability to upload a single file or a single directory.
#!/bin/sh
# Upload the path given as the first argument; default to everything
# under the current directory
if [ -n "$1" ]; then
LOC="$1"
else
LOC="*"
fi
# Set your bucket
BUCKET=
# If you want to use a prefix, set it like PREFIX=blog/
PREFIX=
# Absolute path to s3cmd.rb
S3CMD=/home/user/s3sync/s3cmd.rb
find $LOC -type f -readable -name \*.css -exec sh -c "gzip -9 -c {} > /tmp/s3tmp && \
$S3CMD -v put $BUCKET:$PREFIX{} /tmp/s3tmp x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:text/css Content-Encoding:gzip" \;
find $LOC -type f -readable -name \*.js -exec sh -c "gzip -9 -c {} > /tmp/s3tmp && \
$S3CMD -v put $BUCKET:$PREFIX{} /tmp/s3tmp x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:application/x-javascript Content-Encoding:gzip" \;
find $LOC -type f -readable -name \*.png -exec ${S3CMD} -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/png \;
find $LOC -type f -readable -name \*.gif -exec ${S3CMD} -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/gif \;
find $LOC -type f -readable \( -name \*.jpg -o -name \*.jpeg \) -exec ${S3CMD} -v put $BUCKET:$PREFIX{} {} \
x-amz-acl:public-read Cache-Control:max-age=604800 Content-Type:image/jpeg \;
For example, if you saved this script to a file named cloudfront and made it executable (chmod +x cloudfront):
cd wordpress
./cloudfront wp-content/uploads
Without any command-line argument, the script uploads all files under the current directory.