PHP’s dynamic, scalable and user-friendly features are direct contributors to the language’s popularity. However, a common problem developers face is search engine optimization: if scripts are not constructed carefully, search engines may not index the resulting pages properly. There are several reasons why this happens, and some of them, along with their solutions, are described below.
Latency – PHP code takes time to execute, so search engine spiders may give up waiting for a slow page and move on. You can reduce the wait time by being frugal with SELECT * queries: avoid running them against a table with ten fields when the page only needs two or three. Name the columns you actually need and (assuming you use MySQL) use EXPLAIN to test your queries. If you use loops, move duplicated work outside them.
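As an illustration, here is a minimal sketch using PDO against a hypothetical “articles” table (the database name, table and column names are all assumptions made for this example): it selects only the columns the page needs and then runs EXPLAIN on the same query so you can check that MySQL is using an index rather than scanning the whole table.

<?php
// Sketch only: assumes a MySQL database "blog" with an "articles" table
// containing (among other fields) id, title, summary and published columns.
$pdo = new PDO('mysql:host=localhost;dbname=blog', 'user', 'password');

// Select only the columns the page needs instead of SELECT *.
$stmt = $pdo->prepare('SELECT id, title, summary FROM articles WHERE published = 1');
$stmt->execute();
$articles = $stmt->fetchAll(PDO::FETCH_ASSOC);

// During development, run EXPLAIN on the same query to confirm
// that MySQL can use an index instead of a full table scan.
$explain = $pdo->query('EXPLAIN SELECT id, title, summary FROM articles WHERE published = 1');
print_r($explain->fetchAll(PDO::FETCH_ASSOC));
?>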
Session ID – When the “enable-trans-sid” option is turned on, PHP appends the session ID to every link it generates, so spiders see the same page under many different URLs. This is not good for search engine indexing. In your “php.ini”, disable this behaviour by setting “session.use_trans_sid” to 0.
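For example, a minimal sketch (assuming a PHP version where this setting can be changed at runtime; the php.ini line shown in the comment achieves the same thing globally):

<?php
// In php.ini the equivalent line is:
//   session.use_trans_sid = 0
// Disabling it before session_start() stops PHP from appending
// PHPSESSID to generated links, so sessions rely on cookies instead.
ini_set('session.use_trans_sid', '0');
session_start();
?>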
Friendly URLs – There are two ways to achieve a static-page effect that helps with search engines. Apache can be used to fake static page URLs, and you can reduce the number of GET variables, which produce long query strings that are almost incoherent to spiders. If you really need many variables, combine them into a single parameter using a delimiter or an otherwise unused character, then split them apart again in the target page, as in the sketch below.
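Here is a rough sketch of the combine-and-split idea; the parameter name “data” and the pipe delimiter are arbitrary choices made for this illustration:

<?php
// Sketch: instead of page.php?cat=12&sort=price&page=3, build one parameter
// with a pipe as the delimiter: page.php?data=12|price|3
$url = 'page.php?data=' . urlencode('12|price|3');

// In the target page, split the combined parameter back into its parts
// and validate each piece before using it.
if (isset($_GET['data'])) {
    $parts = explode('|', $_GET['data']);
    if (count($parts) === 3) {
        list($category, $sort, $page) = $parts;
        $category = (int) $category;
        $page     = (int) $page;
    }
}
?>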
In addition to this, you can also use Apache’s mod_rewrite module to map clean URLs onto your PHP scripts, but this should be done with careful consideration and only after you understand how it works.
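As an illustration only (the URL pattern and script name are assumptions), a rule such as “RewriteRule ^article/([0-9]+)/?$ article.php?id=$1 [L]” in an .htaccess file lets spiders see a static-looking address like /article/42 while Apache quietly serves article.php; the PHP side then reads the rewritten parameter as usual:

<?php
// article.php – receives the id extracted by the assumed rewrite rule:
//   RewriteEngine On
//   RewriteRule ^article/([0-9]+)/?$ article.php?id=$1 [L]
// To the spider the URL looks like a static page (/article/42),
// but the script still receives a normal $_GET parameter.
$id = isset($_GET['id']) ? (int) $_GET['id'] : 0;
if ($id > 0) {
    echo 'Showing article ' . $id;
}
?>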
Following these suggestions should help make your pages more search engine friendly. However, it is worth setting up a secondary “dummy” site and testing the changes there before implementing them on your main site; this lets you debug the code without annoying your visitors.