It seems Google is not indexing pages that have some HTML errors in the source code.
I found it easy to include full pages (written with some WYSIWYG authoring tools) with PHP; the problem is that it includes the HTML tags twice, and Google doesn't seem to like it.
Have you ever heard of this?
Well, it's a well-known fact that if you either have dynamic URLs (e.g. yoursite.net/script.php?id=asdfg&blah) or have PHP sessions appended to your URLs (e.g. yoursite.net/index.php?PHPSESSID=12345678910), the Googlebot has a cow and doesn't index your pages properly... I don't think Google particularly cares if you have poor HTML, though.
Google indexes dynamic pages!!
Generally speaking, robots can encounter problems while crawling your site if there is more than one parameter in the URL. Try to shorten your URLs, or prepare a Google sitemap, which can help the robot properly index all your pages (or at least the main site sections).
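In case it helps, a minimal sitemap file is just an XML list of your URLs; something like this (the URL and values below are placeholders, not from any real site):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- one <url> entry per page you want crawled -->
  <url>
    <loc>http://yoursite.net/index.php</loc>
    <changefreq>weekly</changefreq>
    <priority>0.8</priority>
  </url>
</urlset>
```

You save it as sitemap.xml in your site root and submit it to Google.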
As to the wrong HTML: if you use PHP to include complete HTML pages, why don't you remove the duplicated HTML tags (I suppose you mean <body>, <head> and so on) using PHP?
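For example, one rough way to do that (a sketch, not tested against your pages; the function name body_of is made up) is to grab only what's between the <body> tags of the included file before echoing it, so the outer page's <html>/<head>/<body> tags aren't doubled:

```php
<?php
// Hypothetical helper: return only the <body> contents of a full HTML page.
function body_of(string $html): string
{
    // Match everything between <body ...> and </body>, case-insensitively,
    // with "." also matching newlines (the /s modifier).
    if (preg_match('/<body[^>]*>(.*?)<\/body>/is', $html, $m)) {
        return $m[1];
    }
    // Fragment with no <body> tags: pass it through unchanged.
    return $html;
}

// Instead of:  include 'page.html';
// do:
echo body_of(file_get_contents('page.html'));
```

A regex is crude but usually good enough for pages your own WYSIWYG tool produced; if the pages are messier, a real HTML parser would be safer.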