This is a follow-up to my first post on this subject, from July 7 of this year.
I have had the multi-site setup running on one core Drupal 7 installation for about three months now. Time to see how it has all played out.
So far I have had no serious problems, but some things are a bit tricky.
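For readers new to the setup: a Drupal 7 multi-site shares one code base and maps each domain to its own folder under sites/, each with its own settings.php and database. The layout below is a sketch with hypothetical site names:

```
sites/
  all/                   # modules and themes shared by every site
    modules/
    themes/
  example-one.com/       # one folder per site (hypothetical names)
    settings.php         # points to this site's own database
    files/
  example-two.com/
    settings.php
    files/
```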
Robots.txt: In a multi-site installation there is obviously only one robots.txt file. That is fine until you want to develop a new site. In that case all live sites need the robots.txt that ships with Drupal 7, while the development site needs a robots.txt that blocks all crawlers: User-agent: * Disallow: /
Fortunately there is a module for that: Robots.Txt. It allows you to create a separate robots.txt for every site in your multi-site setup. Do not forget to delete the original robots.txt file, otherwise the web server keeps serving the static file instead of the module's version.
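The blocking robots.txt for the development site is just two lines:

```
User-agent: *
Disallow: /
```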
Updates: Performing updates is fairly easy; I have done a few module updates and one core update. Initially I was a bit worried about it, but it is all a matter of following the right steps:
- Put all sites in maintenance mode.
- Core update: replace the Drupal core files with the new version.
- Go to the initial site and check whether database updates are needed; if so, update the databases of all sites.
- Module update(s): go to the initial site and update the modules via Modules -> Update, then check whether database updates are needed; if so, update the databases of all sites.
- Take all sites out of maintenance mode and you are done.
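The steps above can also be scripted with Drush. This is only a sketch: it assumes Drush is installed and that @site1 to @site3 are hypothetical site aliases, one per site in sites/; with DRY_RUN set to echo it just prints the commands instead of running them.

```shell
#!/bin/sh
# Sketch of the Drupal 7 multi-site update sequence, driven by Drush.
# Assumptions: Drush is installed; @site1..@site3 are hypothetical site
# aliases. DRY_RUN="echo" turns the script into a harmless dry run.
DRY_RUN="echo"
SITES="@site1 @site2 @site3"

for_each_site() {                       # run one drush command on every site
  for SITE in $SITES; do
    $DRY_RUN drush "$SITE" "$@" --yes
  done
}

for_each_site vset maintenance_mode 1   # 1. all sites into maintenance mode
# 2. replace the core / module code by hand (or with drush pm-update)
for_each_site updatedb                  # 3. run database updates on every site
for_each_site vset maintenance_mode 0   # 4. take all sites back online
```

Because every site shares the code but has its own database, the updatedb step is the one you must repeat per site.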
Was it worth the effort? For me, yes: I created a series of similar websites with it and it saved me a lot of time.
The multi-site option is worthwhile if you need to create a number of websites with a similar structure, such as customized sites for every location of a franchise business. For different websites with different features and themes I would not consider it; I am afraid it would make things too complicated to maintain.
Next, I am preparing a post about social links and meta tags.