Work of Fiction Inspires a New Search Engine
#JAGSTER: Judiciously Arranged Global Search-Term Evaluative Ranker
“Caitlin Decter is young, pretty, feisty, a genius at math — and blind. Still, she can surf the net with the best of them, following its complex paths clearly in her mind. When a Japanese researcher develops a new signal-processing implant that might give her sight, she jumps at the chance, flying to Tokyo for the operation. But Caitlin’s brain long ago co-opted her primary visual cortex to help her navigate online. Once the implant is activated, instead of seeing reality, the landscape of the World Wide Web explodes into her consciousness, spreading out all around her in a riot of colors and shapes. While exploring this amazing realm, she discovers something — some other — lurking in the background. And it’s getting smarter …”
Robert J. Sawyer needs no introduction to science fiction readers. For everyone else, let’s just say that Hugo and Nebula awards are kind of a big deal. But it is not Robert’s forty-plus awards that set him apart from other notable science fiction writers; it’s his fresh and vibrant storytelling, backed up by an amazing amount of technically correct detail.
I am a hard S.F. fan and cherish the works of Clarke and Asimov, but Sawyer’s novel WWW: Wake brought something entirely new into my reading experience – Google PageRank.
Yes, you heard me right: the PageRank algorithm, Google, links, black-hat SEO, search monopolies, and open-source alternatives.
I have to admit I was taken by surprise. I normally read (or listen to) books to rest my brain from work, and this felt like somebody had ripped a hole in the fabric of my fantasy world and slapped me with a cold towel of SEO.
Here’s an excerpt from the book:
“Miss Caitlin,” wheezed the familiar voice.
“Dr. Kuroda, hi!”
“I have an idea,” he said. “Do you know about Jagster?”
“Sure,” said Caitlin.
“What’s that?” asked her mom.
“It’s an open-source search-engine — a competitor for Google,” said Kuroda. “And I think it may be of use to us.”
Caitlin swiveled in her chair to face her computer and typed “jagster” into Google; not surprisingly, the first hit wasn’t Jagster itself — no need for Coke to redirect customers to Pepsi! — but rather an encyclopedia entry about it. She brought the article up on screen so her mother could read it.
From the Online Encyclopedia of Computing: Google is the de facto portal to the web, and many people feel that a for-profit corporation shouldn’t hold that role — especially one that is secretive about how it ranks search results. The first attempt to produce an open-source, accountable alternative was Wikia search, devised by the same people who had put together Wikipedia. However, by far the most successful such project to date is Jagster.
The problem is not with Google’s thoroughness, but rather with how it chooses which listings to put first. Google’s principal algorithm, at least initially, was called PageRank — a jokey name because not only did it rank pages but it had been developed by Larry Page, one of Google’s two founders. PageRank looked to see how many other pages linked to a given page, and took that as the ultimate democratic choice, giving top positioning to those that were linked to the most.
Since the vast majority of Google users look at only the ten listings provided on the first page of results, getting into the top ten is crucial for a business, and being number one is gold — and so people started trying to fool Google. Creating other sites that did little more than link back to your own site was one of several ways to fool PageRank. In response, Google developed new methods for assigning rankings to pages. And despite the company’s motto — “don’t be evil” — people couldn’t help but question just what determined who now got the top spots, especially when the difference between being number ten and number eleven might be millions of dollars in online sales.
But Google refused to divulge its new methods, and that gave rise to projects to develop free, open-source, transparent alternatives to Google: “free” meaning that there would be no way to buy a top listing (on Google, you can be listed first by paying to be a “sponsored link”); “open source” meaning anyone could look at the actual code being used and modify it if they thought they had a fairer or more efficient approach; and “transparent” meaning the whole process could be monitored and understood by anyone.
What makes Jagster different from other open-source search engines is just how transparent it is. All search engines use special software called web spiders to scoot along, jumping from one site to another, mapping out connections. That’s normally considered dreary under-the-hood stuff, but Jagster makes this raw database publicly available and constantly updates it in real-time as its spiders discover newly added, deleted, or changed pages. In the tradition of silly web acronyms (“Yahoo!” stands for “yet another hierarchical officious oracle”), Jagster is short for “judiciously arranged global search-term evaluative ranker” — and the battle between Google and Jagster has been dubbed the “ranker rancor” by the press…
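The PageRank idea the excerpt describes — count links as votes and give top positions to the most-linked pages — can be sketched in a few lines of Python. This is a toy illustration, not Google’s actual implementation: the tiny link graph, the 0.85 damping factor, and the fixed iteration count are all my own illustrative choices.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy PageRank via power iteration.

    links maps each page to the list of pages it links to.
    Returns a dict of page -> rank score (scores sum to 1).
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with equal rank everywhere
    for _ in range(iterations):
        # Every page gets a small baseline share (the "damping" term) ...
        new_rank = {p: (1 - damping) / n for p in pages}
        # ... plus a share of the rank of every page that links to it.
        for page, outlinks in links.items():
            if not outlinks:
                # Dangling page: spread its rank evenly over all pages.
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

# A tiny hypothetical web: three pages all link to "hub",
# so "hub" should come out on top.
graph = {
    "hub": ["a"],
    "a": ["hub"],
    "b": ["hub"],
    "c": ["hub", "a"],
}
ranks = pagerank(graph)
top = max(ranks, key=ranks.get)
```

This also makes the excerpt’s point about gaming the system concrete: adding more pages like "b" and "c" that exist only to link to your site inflates its score, which is exactly the link-farm trick that forced Google to move beyond plain PageRank.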
Can you guess what I did at this point? Yes, I googled “Jagster,” and what I discovered was disappointing and awesome at the same time. First of all, Jagster didn’t really exist. The good news is that Brad Detchevery, a programmer from Canada (who also read the book), registered JAGSTER.ORG with the intention of bringing Sawyer’s fictional search engine to life. Ironically, he set up the competing project on Google Code.
Brad’s search engine is still a concept in need of contributors and funding. Whether it ever gets developed and successfully launched is a matter of leadership, dedication, funding, technical expertise, ingenuity… and a bit of luck. So far it still sounds like ‘science fiction,’ but I think it’s time another great author inspired technological advancement, and what better place than the web.
Sawyer took note of technology very similar to the one described in his book being used in the UK to gauge the degree of online piracy, and commented: “I make no comment here about the ethics of what’s happening in the UK, but the technique of actually analyzing every packet in the datastream to determine who is looking at what is very similar to the technique I proposed for Jagster.”
Must Read: Interview with Robert J. Sawyer