We're going to use mongrel_cluster to manage your small set of Mongrel applications. We'll need the mongrel_cluster gem, and we'll first just set up the testapp so you can see how it's done. This will lay the groundwork for your real production application.
5.5.2. A Simple Test Rails Application
There's a bit of a chicken-and-egg problem with this kind of setup. In order to test that your web1 machine works with your app1 machine, you need to have a working application on app1. But you can't really get your application working without web1 and db1 working. The best thing to do is to create a small "test" application and serve it manually until you're ready for the big time:
$ cd /var/www/apps
$ rails testapp
$ cd testapp
$ mongrel_rails start
You should then be able to visit http://localhost:3000/ and see the Rails test page. That means your Mongrel setup is working, but you'll want something that also shows Rails is working. Stop Mongrel with CTRL-C and do:
$ script/generate controller test
Edit app/controllers/test_controller.rb and add this to the TestController class:
def index
  render :text => "test"
end
Then start up Mongrel again and go to http://localhost:3000/test to see your little tester Rails application.
If you haven't yet, stop your Mongrel instance with CTRL-C and install the mongrel_cluster gem:
$ gem install mongrel_cluster
Next we'll want to configure our testapp instance so that it runs three Mongrel instances starting at port 8000:
$ sudo mongrel_rails cluster::configure -e production \
    -p 8000 -N 3 -c /var/www/apps/testapp -a 127.0.0.1 \
    --user mongrel --group mongrel
This creates a configuration file mongrel_cluster.yml and puts it in /var/www/apps/testapp/config. Now you can start this cluster using:
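As a rough sketch, the generated file should look something like the following. The exact keys can vary slightly between mongrel_cluster versions, so treat this as illustrative rather than canonical:

```yaml
# /var/www/apps/testapp/config/mongrel_cluster.yml
# A sketch of what cluster::configure generates; keys may differ
# slightly depending on your mongrel_cluster version.
cwd: /var/www/apps/testapp
environment: production
address: 127.0.0.1
port: "8000"
servers: 3
user: mongrel
group: mongrel
pid_file: log/mongrel.pid
```

The `servers: 3` and `port: "8000"` pair is what produces the three Mongrels on ports 8000, 8001, and 8002.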
$ cd /var/www/apps/testapp
$ mongrel_rails cluster::start
And of course the command cluster::stop will stop it. You should now (with the cluster still running) try to hit each port (8000, 8001, 8002) with your browser and make sure the test pages are present.
We're going to leave this running for now while we set up the web1 Apache server. Then we'll come back and finalize the install by putting your app on, configuring it, and putting it into the boot process for app1.
Refer to Section 4 to learn how to get your Apache 2.2.x install up and running. If you're using virtual hosts, make sure they're configured correctly and serving the files for your domain name.
Once you have everything running you'll need to change your configuration to point your virtual host at the app1 servers:
<Proxy balancer://mongrel_cluster>
  BalancerMember http://app1:8000
  BalancerMember http://app1:8001
  BalancerMember http://app1:8002
</Proxy>
Change app1 to be either your real hostname or the IP address. Restart your Apache instance according to your system's process restart method and hit http://web1/ to see if your application comes up. If not, double-check your proxy configuration, your Apache error logs, and that the Mongrel cluster is still running and reachable from web1.
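Putting the pieces together, a complete virtual host on web1 might look roughly like this. It assumes a stock Apache 2.2 with mod_proxy and mod_proxy_balancer loaded, and the hostnames and paths are placeholders for your own:

```apache
<VirtualHost *:80>
  # Placeholder names; substitute your real hostname and paths.
  ServerName web1.example.com
  DocumentRoot /var/www/apps/testapp/public

  <Proxy balancer://mongrel_cluster>
    BalancerMember http://app1:8000
    BalancerMember http://app1:8001
    BalancerMember http://app1:8002
  </Proxy>

  # Send everything to the Mongrel cluster. A tuned production setup
  # would usually add rewrite rules to serve static files directly
  # before falling through to the proxy.
  ProxyPass / balancer://mongrel_cluster/
  ProxyPassReverse / balancer://mongrel_cluster/
</VirtualHost>
```

Restart Apache after adding this and the balancer will round-robin requests across the three Mongrels.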
You now have your web1 server balancing against your app1 Mongrel cluster, and you're ready to get your database configured on db1. First, though: you should have been using Rails migrations this whole time. It was mentioned in Section 5.1 as something you should do, but it's pretty much a requirement at this point. Later you'll have to deploy your schema to MySQL and then manage versioning of the schema, so if you want to be a hero and do them manually, make sure your SQL scripts are set up properly.
If you don't have migrations, you're on your own (and probably very much in trouble). You'll have to go through the MySQL configuration and just make it up as you go.
What you need to do is first make sure MySQL is running and configured properly and that you can use the mysql tool to connect to localhost on db1. Once you can do that, create your application's production database, add a database user for it, and grant that user access from app1's address.
Until you can connect manually from app1 to db1, you shouldn't continue. Permissioning and user management in MySQL are rather weird and painful (as with any DBMS), so we usually break down and install phpMyAdmin or webmin to manage it. Yeah, we're evil.
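If you'd rather do it by hand, here is a minimal sketch of the statements to run in the mysql client on db1. The database name, username, password, and app1's address are all placeholders for your own values:

```sql
-- Run as a MySQL admin user on db1.
-- 'myapp_production', 'myapp', 'secretpassword', and '10.0.0.2'
-- (app1's address) are placeholders.
CREATE DATABASE myapp_production;
GRANT ALL PRIVILEGES ON myapp_production.*
  TO 'myapp'@'10.0.0.2' IDENTIFIED BY 'secretpassword';
FLUSH PRIVILEGES;
```

Remember that MySQL accounts are scoped per host, so a grant for `'myapp'@'localhost'` will not let app1 connect; the host part must match the address app1 connects from.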
5.5.6. Last Step: Your Application in Production
These instructions are intended to get you up and running quickly with a basic best-practice deployment, but also to teach you how everything is installed and managed. In reality, everyone who's half smart about anything uses Capistrano to do their deployments. Capistrano is excellent software that uses Rake and ssh to automate application deployments. It can manage fairly large installations, restart servers, roll back failed deployments, and provide status as well.
The problem with Capistrano is that until you've done one deployment manually it's difficult to use Capistrano to automate your installation. We usually do the first deployment by hand in a new location, and then we step back and plan the automation with Capistrano. As you gain experience doing this you'll be able to start off right away with Capistrano, but for now we're going to do this first deployment by hand.
It is a sin of the highest degree to continue doing deployments manually once you've done your first one by hand. Capistrano is a fantastic tool that is easy to understand if you know even a small amount about build tools (Rake, Make, SCons, nmake). Taking the time after this deployment to automate your work will save you much pain and anger in the future.
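To give a taste of what that automation looks like, here is a heavily abbreviated sketch of a Capistrano `config/deploy.rb` from this era. The repository URL, hostnames, and paths are placeholders, and the exact recipe names for mongrel_cluster integration vary by version, so treat this as an outline rather than a drop-in file:

```ruby
# config/deploy.rb -- a minimal sketch, not a complete deployment file.
# All names and URLs below are placeholders.
set :application, "myapp"
set :repository,  "http://mysvn/repository/myapp"
set :deploy_to,   "/var/www/apps/#{application}"
set :user,        "mongrel"

# The three machines from this chapter's layout.
role :web, "web1"
role :app, "app1"
role :db,  "db1", :primary => true

# Restart the Mongrel cluster on app1 after each deployment.
task :restart, :roles => :app do
  run "cd #{current_path} && mongrel_rails cluster::restart"
end
```

With something like this in place, `cap deploy` checks out the code, updates the `current` symlink, and restarts your Mongrels in one step.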
5.5.6.1. Get Your Application On
You've got two approaches for getting your code onto the app1 server: check it out from your revision control system, or copy it over by hand.
Let us assume that you've got your code in Subversion and you're going to set up your application just like your testapp:
$ cd /var/www/apps
$ svn co http://mysvn/repository/myapp
$ sudo mongrel_rails cluster::configure -e production \
    -p 8000 -N 3 -c /var/www/apps/myapp -a 127.0.0.1 \
    --user mongrel --group mongrel
$ cd testapp; mongrel_rails cluster::stop
$ cd ../myapp; mongrel_rails cluster::start
Your application should start up and you should see tons of errors. What you must do now is configure this deployment so that it uses your db1 database, runs your rake db:migrate task, and completes any configurations your application specifically needs. It might be a good idea during this stage to not use mongrel_cluster but to just run the application manually with mongrel_rails start -p 8000 and test against http://app1:8000/ until the application works.
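The usual first error is the database connection. A sketch of the production section of `config/database.yml` for this layout follows; the database name, username, and password are placeholders, and `db1` must resolve to your database server:

```yaml
# config/database.yml (production section only) -- a sketch.
# Database name, username, and password are placeholders.
production:
  adapter: mysql
  database: myapp_production
  username: myapp
  password: secretpassword
  host: db1
```

Once this matches the account you created on db1, `rake db:migrate RAILS_ENV=production` should run cleanly from app1.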
Halt! Before you start to change the files you just checked out, take a step back. You have a revision control tool that helps you keep track of your source changes, but if you edit files outside of this process then you'll run into conflicts later. Since you've checked the source out, you have a few choices that make sense: edit the files in place and commit the changes back; maintain a separate production branch; or keep the production configuration outside the checkout and apply it at deployment time. Our general process is to edit and commit directly for the first deployment since it's quicker and more direct, then when we automate with Capistrano we keep the production configuration outside the checkout to keep things sane and safe.
5.5.6.2. Systemize Your Deployment
The mongrel_cluster gem comes with an /etc/init.d startup script that works on most Linux systems. With a small amount of setup, your newly minted and functioning application will be started whenever the machine reboots.
$ mkdir /etc/mongrel_cluster
$ ln -s /var/www/apps/testapp/config/mongrel_cluster.yml \
    /etc/mongrel_cluster/testapp.yml
$ cp /path/to/mongrel_cluster_gem/resources/mongrel_cluster \
    /etc/init.d/
$ chmod +x /etc/init.d/mongrel_cluster
Then you'll need to configure your system to run /etc/init.d/mongrel_cluster on start/stop and you're set. You should try rebooting your machine to make sure the application actually does start properly.
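How you wire the script into the boot sequence depends on your distribution. As a sketch (run as root), the common tools are:

```shell
# Debian/Ubuntu: register the init script at the default runlevels.
update-rc.d mongrel_cluster defaults

# Red Hat/CentOS: the init script must carry chkconfig headers.
chkconfig --add mongrel_cluster
chkconfig mongrel_cluster on
```

Both approaches just create the runlevel symlinks for you; if your distribution uses something else, create the symlinks to /etc/init.d/mongrel_cluster according to its conventions.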
5.5.6.3. The Final Get-Together
The only remaining task is to actually make sure all the pieces work. It's a good idea to test each component in order from back to front, and then test them as one whole. If you were smart you'd also automate this test process and include it in your post-production deployment process.
5.5.6.4. Cheap Simple Caching
Normally you'll need to set up Apache mod_rewrite rules of varying complexity levels to take advantage of Rails' page caching. Since you have a front-facing web1 server you'll have to do some research to find a way to share disk with app1. This is fairly complex but there is a quick and dirty way around it for many people's applications: mod_cache.
<IfModule mod_cache.c>
  LoadModule mem_cache_module modules/mod_mem_cache.so
  CacheEnable mem /
  MCacheSize 4096
  MCacheMaxObjectCount 100
  MCacheMinObjectSize 1
  MCacheMaxObjectSize 2048
</IfModule>
This configuration is only a sample taken from the official mod_cache documentation, but it demonstrates the trick. This will cache the majority of your content and will work as long as your content doesn't require immediate updating. There are also ways to exclude some locations, to cache to disk, and to control the cache organization. Using the memory cache often works well, as you can just bounce the server after a deployment to reset it, but the details of your application will determine whether it's a good fit for you.
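As an example of excluding a location, mod_cache provides the CacheDisable directive; the path here is a placeholder for whatever area of your application must never be served stale:

```apache
# Keep dynamic, user-specific areas out of the cache.
# "/admin" is a placeholder path.
CacheDisable /admin
```

Anything under the disabled path will always be fetched from the Mongrel cluster, even while the rest of the site is served from cache.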
Another trick you should investigate is using an asset server in Ruby on Rails. You make this setting in config/environments/production.rb:
config.action_controller.asset_host = "http://assets.web1"
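For this to work, the asset hostname (here the placeholder `assets.web1`) must resolve to a web server that serves your application's public directory. A bare-bones Apache virtual host for it might look like:

```apache
<VirtualHost *:80>
  # Placeholder hostname; point it at the same public/ directory
  # your application's asset tags were generated against.
  ServerName assets.web1
  DocumentRoot /var/www/apps/myapp/public
</VirtualHost>
```

The win here is that browsers limit concurrent connections per hostname, so serving images, stylesheets, and JavaScript from a second hostname lets pages download assets in parallel with application requests.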