Frontend web technologies (notably JavaScript with XML, CMS platforms, and the like) are gaining momentum across the web, even on common personal websites rather than just well-ranked business sites.
It is well known that Googlebot sweeps through the web, scanning and evaluating HTML, text, and links to rank websites, assigning quality scores and applying penalties where needed. In a recent Forbes (Velocity) article by Taylor Buley (a staff writer and editorial developer), a Google spokesperson confirmed that Googlebot can also execute code, not merely parse it, including JavaScript within websites and web applications. Many in the web design and SEO community had previously believed this was not the case.
For a while now, web developers and SEO experts have noticed Google indexing links that could not have been discovered without executing JavaScript: links that are assembled on the fly. With the release of Caffeine (an overhauled indexing system) in June, this JavaScript capability could have been part of the mix. It should certainly help Google identify spam and security threats, and allow it to develop a deeper understanding of website advertising relationships and traffic-generating sites.
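To illustrate what an on-the-fly link looks like, here is a minimal sketch of a link built entirely in JavaScript; the element id and URL are hypothetical. A crawler that only parses the static HTML would never see this anchor, while one that executes JavaScript could discover and follow it.

```javascript
// Hypothetical example: a navigation link assembled at runtime.
// The static HTML contains no <a> tag; it only exists after this script runs.
document.addEventListener('DOMContentLoaded', function () {
  var nav = document.getElementById('site-nav');   // hypothetical container element
  var link = document.createElement('a');
  link.href = '/promotions/' + encodeURIComponent('summer-sale'); // URL pieced together on the fly
  link.textContent = 'Summer Sale';
  nav.appendChild(link);
});
```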