18 April 2014

The Ruby Reflector

Topic

Amazon S3

By Ryan Stenberg of Viget.com Blogs 1 month ago.

…interesting challenge with this particular data migration when one of the columns was used to store an Amazon S3 URL for a CarrierWave-uploaded file. CarrierWave is pretty magical, but we found it can be difficult to work with when trying to migrate existing data around a destructive Rails migration (where we make structural changes to our database that result in a loss of data). In this post, I'd like to share our experiences and two of the approaches we experimented with.
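As a minimal illustration of the problem (the URL format and helper name below are hypothetical, not Viget's code): a CarrierWave-mounted column stores only the uploaded file's identifier, so a migration that inherits a column of full S3 URLs needs to recover that identifier first.

```ruby
require "uri"

# Hypothetical helper: extract the filename (CarrierWave's identifier)
# from a full Amazon S3 URL stored in a legacy column.
def filename_from_s3_url(url)
  File.basename(URI.parse(url).path)
end

filename_from_s3_url("https://my-bucket.s3.amazonaws.com/uploads/photo.png")
# => "photo.png"
```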

An Example …

viget.com Read
By Hongli Lai of Phusion Corporate Blog 3 months ago.

Binaries are now downloaded from an Amazon S3 mirror if the main binary server is unavailable.

And finally, although this isn't really a change in 4.0.34, it should be noted. In version 4.0.33 we changed the way Phusion Passenger's own Ruby source files are loaded, in order to fix some Debian and RPM packaging issues. The following doesn't work anymore: require 'phusion_passenger/foo'

Instead, it should become: PhusionPassenger.require_passenger_lib …

blog.phusion.nl Read
By Hongli Lai of Phusion Corporate Blog 3 months ago.

Setting up an Amazon S3 mirror for high availability. If the main server is down, Phusion Passenger should automatically download from the mirror instead. We're currently working on this.
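A hypothetical sketch of that fallback logic (the URLs and the injected `fetch` callable are assumptions for illustration, not Phusion's actual implementation):

```ruby
# Hypothetical mirror fallback: try the main binary server first, and
# fall back to the S3 mirror only if the main download raises an error.
MAIN_SERVER = "https://main.example.com"        # assumed URL
S3_MIRROR   = "https://mirror.s3.amazonaws.com" # assumed URL

def download_with_fallback(path, fetch:)
  fetch.call("#{MAIN_SERVER}/#{path}")
rescue StandardError
  fetch.call("#{S3_MIRROR}/#{path}")
end
```

Here `fetch` is any callable that downloads a URL and raises on failure, which keeps the fallback logic testable without network access.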

The goal is to finish all these items this week and to release a new version that includes these fixes. We're working around the clock on this.

Workarounds for now

Users can apply the following workaround for now in order to prevent Phusion Passenger from freezing during downloading of …

blog.phusion.nl Read
By Mislav of Mislav's blog 4 months ago.

The result is the cached-bundle script whose entire core logic can be seen below. It delegates the Amazon S3 upload logic to a separate s3-put script:

    cache_name="${TRAVIS_RUBY_VERSION}-${gemfile_hash}.tgz"
    fetch_url="http://${AMAZON_S3_BUCKET}.s3.amazonaws.com/${TRAVIS_REPO_SLUG}/${cache_name}"

    if download "$fetch_url" "$cache_name"; then
      tar xzf "$cache_name"
    fi

    bundle "$@"

    if [ ! -f "$cache_name" ]; then
      tar …
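The `${gemfile_hash}` variable is not shown in the excerpt; one plausible way to derive it (an assumption, not Mislav's actual implementation) is a digest of `Gemfile.lock`, so the cached bundle is reused only while the resolved dependencies are unchanged:

```ruby
require "digest"

# Assumed cache-key scheme: hash the lockfile contents, so any change
# to the resolved dependencies produces a new cache name.
def gemfile_hash(lockfile = "Gemfile.lock")
  Digest::SHA256.file(lockfile).hexdigest
end
```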

mislav.uniqpath.com Read
By Klampaeckel of till's blog 6 months ago.

…install, there are a lot of round-trips between the server, our satis and GitHub (or Amazon S3).

One of my first ideas was to get around a continuous reinstall by symlinking the vendor directory between releases. This doesn't work consistently for two reasons:

What's a release?

OpsWorks, or Chef in particular, calls deployed code "releases".

A release is the checkout/clone/download of your application and lives in /srv/www:

    srv/
    └── www
        └── my_app …
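The symlink idea the author tried could be sketched as follows (hypothetical paths and helper, for illustration only):

```ruby
require "fileutils"

# Hypothetical sketch of sharing one vendor/ directory across releases
# by symlinking it into each new release checkout.
def link_shared_vendor(release_dir, shared_vendor)
  FileUtils.mkdir_p(release_dir)
  FileUtils.ln_s(shared_vendor, File.join(release_dir, "vendor"))
end
```

As the post explains, this doesn't work consistently in practice, since each deploy produces a fresh release directory.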

till.klampaeckel.de Read
By kahfei of kahfei 11 months ago.

"A bucket is a container for objects stored in Amazon S3. When creating a bucket, you can choose a Region to optimize for latency" — that is what's mentioned in the guide, so I chose Singapore as it is closest to my location.

Now if you had chosen the "US Standard" region, things would have worked out much more straightforwardly; any non-US region needs some tweaks in the configuration. The Heroku documentation did mention that some international users may need to override …
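For instance, with CarrierWave backed by fog, a non-US region is typically set in the initializer (the bucket name and region below are illustrative; the exact override Heroku's docs describe may differ):

```ruby
# config/initializers/carrierwave.rb (illustrative values)
CarrierWave.configure do |config|
  config.fog_credentials = {
    provider:              "AWS",
    aws_access_key_id:     ENV["AWS_ACCESS_KEY_ID"],
    aws_secret_access_key: ENV["AWS_SECRET_ACCESS_KEY"],
    region:                "ap-southeast-1", # Singapore
  }
  config.fog_directory = "my-bucket"
end
```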

kahfei.com Read
By Todd Hoff of High Scalability 12 months ago.

…Auto Scaling can save costs by better matching demand and capacity. Certainly not a new idea but the diagrams, different leakage scenarios (daily spike, weekly fluctuation, seasonal spike), and the explanation of potential savings (substantial) are well done.

Use the Amazon S3 Object Expiration feature to delete old backups, logs, documents, digital media, etc. A leakage of ~20 TB adds up to a tidy ~1650 USD a year.
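Object Expiration is configured per bucket as lifecycle rules; here is a hypothetical rule (the prefix and retention period are assumptions) built as the hash that a current aws-sdk-s3 `put_bucket_lifecycle_configuration` call expects:

```ruby
# Hypothetical lifecycle rule: expire objects under an assumed "logs/"
# prefix a fixed number of days after creation.
def expiration_rule(prefix:, days:)
  {
    id:         "expire-#{prefix.tr('/', '-')}",
    status:     "Enabled",
    filter:     { prefix: prefix },
    expiration: { days: days },
  }
end
```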

highscalability.com Read
By Klampaeckel of till's blog 1 year ago.

Whatever you find and use — make a copy of it and put it on Amazon S3 or the local network. With larger teams even a local Ubuntu mirror (or whatever you use) can come in handy.

This includes base boxes, packages, etc. Nothing is more annoying than waking up and not being able to bootstrap your VMs because someone decided to remove something in order to force you to upgrade.

Don't dumb it down!

Typically, PHP applications are developed on a single host — Apache, PHP and …

till.klampaeckel.de Read
By Assaf of Labnotes 1 year ago.

…cross-domain cookie attacks from hosted content.

§ Taming The Unicorn: Easing JavaScript Memory Profiling In DevTools .

§ S3CP: Command-line tools for Amazon S3 file manipulation: s3cp, s3ls, s3cat, s3rm, etc.

§ Git koans . I laughed. I cried. I am enlightened.

§ How to Get Your Front End Developers to Fix Things .

blog.labnotes.org Read
By ryan.huddleston of MySQL Performance Blog 1 year ago.

Amazon S3

monitoring for all the above

Philosophy on backups

It is a good idea to schedule both logical and binary backups. They each have their use cases and add redundancy to your backups. If there is an issue with one backup tool, it is unlikely to affect the other.
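As a sketch, the two kinds of backup might be driven by commands like these (paths and tool choices are assumptions: mysqldump is a common logical tool, xtrabackup a common binary one):

```ruby
# Hypothetical command builders for the two backup types, so each can
# be scheduled and monitored independently (e.g. from cron).
def logical_backup_cmd(db, out)
  "mysqldump --single-transaction #{db} | gzip > #{out}"
end

def binary_backup_cmd(datadir, target_dir)
  "xtrabackup --backup --datadir=#{datadir} --target-dir=#{target_dir}"
end
```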

Store your backups on more than one server.

In addition to local copies, store backups offsite. Look at the cost of S3 or S3+Glacier; it's worth the peace of mind!

Test your backups, and if you have a …

mysqlperformanceblog.com Read