In the Plex: How Google Thinks, Works, and Shapes Our Lives

Jul 5, 2011

Full Video

Global Ethics Forum TV Show

For two years, Levy was given an opportunity to observe Google's operations, development, culture, and advertising model from within the infrastructure, with full managerial cooperation. What did he find?

Introduction

JOANNE MYERS: I'm Joanne Myers, and on behalf of the Carnegie Council I'd like to thank you all for joining us this morning.

Our speaker is Steven Levy, and some of you may know him as the author of the award-winning book Hackers, which PC Magazine named the best sci-tech book written in the last 20 years. Currently, he is a senior writer at Wired magazine. Today he will be talking about Google, a verb, a noun, and an Internet company which is known for many things, including its irreverent culture and its data-driven approach to business decision-making.

Google began as a small startup in 1998 and today has more than 20,000 employees. It is the envy of many.

For anyone interested in technology, just imagine what it might be like if you were given the opportunity to spend two years embedded with Google. You would be invited to attend strategy and development meetings, interview key people, and travel abroad with future leaders of Google. And you would also be able to observe how Google was handling increasingly controversial issues, such as its involvement in China.

Well, dreams do come true, because that is exactly what happened to our speaker. For more than two years, Steven was immersed in Google's culture and corporate life in order to learn more about the inner workings of the company that has given the world its most popular search engine.

Our speaker is known for having a thick Rolodex of contacts at Google, names he began accumulating when he was a technology reporter at Newsweek, and for an acute familiarity with technology and information that he has built up over the years. These were the factors instrumental in the decision to give him unprecedented access to the company while he researched In the Plex.

The story of how this pioneering search engine founded by two Stanford graduates, Larry Page and Sergey Brin, changed the way billions of people not only access information but the way we now think about and receive it, is endlessly fascinating. The people who have clicked on its ads have made Google wildly profitable and turned its founders into billionaires.

What Google has accomplished is the stuff of legend, and what Steven Levy has succeeded in learning about this audacious technological and cultural marvel stands in a category all of its own.

Please join me in welcoming the person who can teach us how Google thinks, works, and shapes our lives.

Steve Levy, thank you for joining us this morning.

Remarks


STEVEN LEVY:
Thank you very much. I want to thank the Carnegie Council for having me here. Thank you all for coming out this morning and hearing about Google.

Arthur C. Clarke once said that any sufficiently advanced technology is indistinguishable from magic. The first time I put words into that little search box at Google and it went through billions of web pages and got exactly what I wanted, it felt like magic. Do you have the same experience there? Pretty much everyone here does.

For me it happened in 1998. That was the year that Google was founded. I was working at Newsweek, and I was testing out this search engine there was a little buzz about on the Internet. I was really impressed with it and wrote about it in Newsweek.

But it was the next year, 1999, when I became fascinated not only with the technology but the company itself and the effect it was having on the Internet. It wasn't just a way to get from one place to another on the Internet, but I thought it was changing the Internet itself, much in the same way the interstate highway system, when it was built as a place to get from one place to another, wound up changing geography.

What happened was that because Google was great at taking the most obscure site and putting it within reach of the people who were interested in that subject, people felt confident creating websites where previously they would have had no reason to do so. It was really building up the Internet.

So I thought, this company is so important, I should just check it out.

I called up the person who was then the communications person for Google, whom I had known because she used to work at Apple, and I said, "I'd like to come down and meet these people." In October 1999 I did so.

I know it was October because when I went to the Googleplex, which is the name of their headquarters, and that name got reflected in the title of my book, everyone was in costume. It was almost Halloween.

Larry Page, one of the cofounders, was dressed like a Viking. He had a big furry vest and giant horns coming out of his hat. Sergey Brin was a cow. He had a giant plastic breastplate with these big udders coming out of it. A Viking and a cow took me to a room and explained to me how the search engine worked.

I was fascinated with Google and fascinated to cover Google for Newsweek for a number of years after that. I always thought it would be great to do a book about it, but I wasn't quite sure how.

But in the year 2007, I was invited to go on an international trip with some of Google's future leaders. I spent 24/7 with these people for 16 days. We went literally around the world: from San Francisco to Tokyo, to Beijing, to Bangalore, to Tel Aviv. In their bubble I learned not just what Google did but how they thought. It really was a company that was totally in sync with the rhythms of the Internet.

I realized, spending time with these young people who were seen as the future leaders of Google, that I was being immersed in the future. I thought, Gee, that's the way to do a book about Google: do it from the inside out, and look not just at this group of future managers but at the search world and the ad world, at the projects and policies they pursued, and spend as much time with the people as possible.

As we learned here, miraculously, they gave me the permission to do it. And I did.

So what did I learn? I learned a lot about search. I learned how their search engine is an artificial intelligence learning machine, and I talk about that.

I learned a lot about ads. It's sort of a miracle in itself that every time you type something into the Google search box, not only does the query go out to the index Google keeps in multiple data centers to get what you want, but an auction takes place. Google is the world's biggest auction company. It runs many, many auctions every minute, even every second, and does it in a really fascinating way, which some advertisers complain is a black box. But it's very clever in that it's an auction where the highest bidder doesn't necessarily win: they mix the bid with their estimate of how good the ad is, to serve the users. I could talk about that.
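To make that idea concrete, here is a minimal sketch, in Python, of an auction where the highest bidder doesn't necessarily win. The names, numbers, and the simple bid-times-quality ranking are illustrative assumptions, not Google's actual formula or code:

```python
# Illustrative toy auction: rank each ad by bid multiplied by an estimated
# quality score, so a cheaper but better ad can beat a higher bidder.
# All names and numbers here are invented for illustration.

def rank_ads(ads):
    """Return ads sorted by bid * quality ('ad rank'), best first."""
    return sorted(ads, key=lambda ad: ad["bid"] * ad["quality"], reverse=True)

ads = [
    {"name": "A", "bid": 4.00, "quality": 0.2},  # highest bid, weak ad
    {"name": "B", "bid": 1.50, "quality": 0.9},  # modest bid, strong ad
    {"name": "C", "bid": 2.00, "quality": 0.5},
]

winner = rank_ads(ads)[0]  # ad "B" wins despite bidding the least
```

Here ad B's rank (1.5 × 0.9 = 1.35) beats A's (4.0 × 0.2 = 0.8), which is the sense in which the highest bidder doesn't necessarily win.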

But considering where we are, I am going to talk about two issues that are maybe of more interest to this great group that comes to these breakfasts. One is privacy. The other is China, which is particularly appropriate here.

Google had its first privacy issue, significantly, on an April Fool's Day. April Fool's is like the national holiday of Google. It's a very whimsical company; the culture is a mix of Montessori and an elite university. They do a lot of play. If you're around Google, you'll see a lot of toys in the hallways, and big medicine balls and things like that.

But this April Fool's joke wasn't a joke. On April 1, 2004, Google introduced a product called Gmail. It was groundbreaking for a couple of reasons.

It was a web-based mail system, meaning the mail doesn't live on your computer, it lives in Google's data centers. Google was very successful in breaking this product out because it gave people more storage than any other similar product had offered.

It wasn't the first of its kind, but it was the first to say, "We understand the way technology is going, that storage is going to be very, very inexpensive and we're going to give you a lot of storage right now. Even though it might cost us somewhat more than we want to pay now, we can understand in a few years it will be almost free so we can afford to do this, while our competitors are still mired in the paradigm that, Gee, storage is expensive, we should withhold it." So that was one thing which was really positive for the product.

The other thing about that product, which caused a lot of consternation among some people, was that when you had your mail open, alongside the content of a message you would see ads. The first time people saw this, a lot of them got really upset and wondered, "Gee, is someone at Google reading our mail?"

This actually was discussed a little when Google was developing Gmail. In a way, it was a little unusual, because sometimes Google doesn't even bother to monetize a new product when it comes out, because the ads are so successful that they just figure, "We have a new product. It will bring more users to Google. They'll search more, and that's where we get our money."

But in this case they said, "Let's make money right off the bat by taking the ad inventory we have and putting relevant ads alongside the mail. If people are upset, all we have to do is explain to them that no, people aren't reading the mail. We have these algorithms that analyze your mail. No human touches your mail. It will be okay." That's the Google mentality: they think that once they have the data that answers a question, that's the end of it.
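The claim that "no human touches your mail" can be pictured with a small sketch: a program matches ad keywords against the message text and returns the relevant ads. This is a hypothetical illustration; the keywords, ads, and scoring below are invented, and Gmail's real matching was far more sophisticated:

```python
# Hypothetical keyword matching: score each ad by how many of its keywords
# appear in the message; only a program ever sees the text.

def match_ads(message, ads):
    words = set(message.lower().split())
    scored = [(sum(kw in words for kw in ad["keywords"]), ad["text"])
              for ad in ads]
    # Keep only ads with at least one keyword hit, best matches first.
    return [text for score, text in sorted(scored, reverse=True) if score > 0]

ads = [
    {"keywords": ["flight", "hotel"], "text": "Cheap flights to Paris"},
    {"keywords": ["camera", "lens"], "text": "Lenses on sale"},
]

matched = match_ads("Booking my flight and hotel for the trip", ads)
```

A message about booking a trip matches the travel ad and not the camera ad; no person has to look at either one.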

But what they failed to realize was that seeing this sort of woke people up to the idea that this company has a lot of information about us—not just in mail, but when we search they must know something about the way we search. It really opened up the perception that Google and privacy is an issue that people should keep an eye on.

They managed to beat down some of the objections that came with Gmail itself. As it turned out, a California legislator was so upset that she threatened to introduce a bill in California to ban this sort of advertising alongside web-based mail. Google had to call out one of its big guns, Al Gore, who was an advisor to Google, and they put Al in a room with this legislator. He had a whiteboard and he diagrammed how it worked. Maybe just to get him to stop, she agreed to back off the bill, and that was the end of that.

I talked about this with Google's policy people, including Nicole Wong, a woman who was hired to be in charge of issues like this. (I just learned yesterday that she is leaving Google, which is part of, I guess, the brain drain the company is experiencing. Maybe we can talk more about that in the question-and-answer period.)

She said, "From that point on, we had to be conscious of privacy issues."

Google has tried to work pretty conscientiously on privacy issues. But they always have this gap that leads them into problems.

Most recently, they had a social product called Buzz. Based on your email contacts, a little social network would form, and you could share information with it, the way you do on Twitter or Facebook. They tested it internally and people really liked it.

One feature that they thought was great was that your social network came up instantly instead of you having to painstakingly say, "Can I invite this person to be in my network?" Because Google figured, Hey, anyone you email with you'll probably want to have in your social network.

What they didn't realize was that when they tested it internally, there was no problem with that, because who cares? Everyone knows that people in Google communicate with each other. They failed to understand that when it was out in the wild, you don't want everyone to know who else you're talking to.

That was something that shocked Google when it came out, that people complained. Some people had some real complaints. There were people who said, "My ex-boyfriend now sees who I'm in contact with and he's part of my social network and seeing other people." They had to change that very, very quickly.

Actually, there was someone who worked for the government whose contacts people were able to get hold of under the Freedom of Information Act, so they could see who he was in contact with. He had to apologize, and he got a slap on the wrist.

Google fixed that. But the product was so deeply hurt that that was really the end of Buzz. Now we have to wait for Google's next big move in social.

I was actually able to sit in on one of their privacy councils. It was fascinating, because Google, like a lot of other companies, has to grapple with the big issue on the Internet: information has become much more discoverable than it was before, now that so much of it has been put on the network.

This privacy council, which pulls together executives from different areas, both policy and engineering, with the chief economist sitting in, looks at different products and at their privacy implications.

The day I sat in, they were looking at a feature of Google Latitude, the product that tracks where you are. It does this with people's permission; they have to opt in. The particular feature they were looking at allowed people to keep a permanent history of everywhere they went.

The engineer proudly showed the way it worked and said: "Look, here's all my peregrinations from the past week." You could see the track. He left what they call "bread crumbs" everywhere he went.

One of the executives responsible for privacy at Google almost had a heart attack looking at this.

But the engineer was saying: "Wait a minute. People want to do this. This is something that people can opt into."

It was sort of a generational thing going on within this discussion. They ultimately decided that they would okay the project, but they would give numerous warnings. Before you signed up for it, several prompts would come up. You'd have to click, "Yes, I want to do this," "Yes, I want to do this," and every couple of weeks you'd get another one saying, "You know we are tracking you; you know this, don't you?" just to keep that up.

It was interesting because when they came out with that product, there was no squall, no one complained about it.

Larry Page once told me, "It's impossible for us to predict which products are going to cause some sort of privacy flap. Sometimes the worst ones really don't and sometimes ones which have negligible privacy implications turn out to be a big flap there." So Google is still trying to figure that out.

The other thing I want to talk about—and I spend a lot of time on this in the book and visit it several times—is China.

Google was started very consciously by two people who wanted to do good. They felt that giving people access to all the information on the Internet much faster than you could before was a good thing for humanity. They wanted to bring the world's knowledge. That's their mission, really, to make all the world's knowledge organized and accessible to all. They have this other unofficial motto, which is, "Don't be evil." Sometimes the two come into conflict there.

One thing that really tested their ethical mettle was their incursion into China. China is sort of irresistible for any technology company. It's the biggest Internet market, it's the fastest growing, and it's the sexiest. You want to be in China.

Google really wanted to go there. But it had a very big hurdle before it went in there, which was that in order to have its search engine in China, it had to agree to have its results censored.

Now, think about this. Here's a company that spends every ounce of effort to get the information before your eyes that you most want to see, and here it is being asked to withhold the information, sometimes the most valuable information that you can get, in the service of an oppressive government that wanted to deny freedom to its citizens.

This was especially painful for Google's cofounder Sergey Brin, who as a young child came over to the United States from Russia. His family was escaping oppression. This is very personal to him.

Yet, Google did this. They went into China. So why did they do it?

It turns out that they felt that they would do good by being in China. They constructed a moral spreadsheet, which is the way I describe it. In this spreadsheet—when you do a big spreadsheet, sometimes things are negative, sometimes things are positive; you calculate it.

The big red cell, really, was that we had to censor; that's a big negative there.

Then there were a lot of black cells, positive things in the spreadsheet. They figured, We'll bring information into China, and that will help educate the people and they will want more freedom then. We would also tell them when we were censoring something, and that would urge them to push back against their government and say, "We want more freedom to see what we want to see." They felt that by being in China Google would change China.

But actually China changed Google. There were a lot of problems, even after Google made this initial compromise.

One thing was that it turned out that the playing field of Google versus the competitors in China—the main competitor is a search engine called Baidu—was tilted against them. Sometimes Google was even blocked and their traffic would be sent to Baidu.

Baidu very cleverly positioned itself as a patriotic alternative to Google. There's a commercial, which you can actually see on YouTube, that ran in China, where Baidu was represented by sort of a wise man. It took place in a rural Chinese village. There was a knowledge showdown between this wise man, dressed in traditional garb, and someone representing Google, who was dressed like Abraham Lincoln: he had a top hat, he had a Lincoln beard, and, oddly, he had an Asian bride alongside him.

Someone would throw out an item of knowledge, and each contestant would say what it meant. The wise Chinese man knew it all; he would just reel it off. But the Google person would mispronounce words and get things wrong. The villagers, with each iteration, would taunt the Abe Lincoln guy, until finally the bride bolted from his side, joined the Chinese man, and Abe Lincoln stumbled off, literally vomiting blood. I'm not making this up. You can check it out on YouTube.

Even more serious was that the demands of the Chinese government didn't stop with the initial censorship requirement. China kept asking for more and more as time went on.

At one point, it even asked Google to censor not only the search engine inside China, the .cn domain, but any search in the Chinese language worldwide. So if Google had accepted this restriction and you searched for Tiananmen Square here in New York City, you would see nothing of the protests; you would just see happy tourists walking around and bureaucrats. This was unacceptable to Google.

There were a number of other problems. Some were cultural gaffes on Google's part, and some were precautions Google took that created problems with its work force in China. It tried to build a work force of the smartest young engineers it could, not only to work on products inside China but on products globally. But, because of its concerns about security, Google denied those engineers in Beijing and Shanghai access to its production code, the information they would need to develop Google products. Google engineers nowhere else in the world have that restriction. The Chinese engineers felt they were second-class citizens, and that caused a lot of tension.

Those tensions got worse when it was discovered that Google's government relations person had given iPods to Chinese officials as gifts one Christmas, which violated regulations. Google had to fire her. Then they had to bring in a team of white-collar investigators to look into it. The people in China, who were dumbfounded that this person got fired over this, felt they were being treated like criminals.

So the morale in China was deteriorating. But, even worse, Google felt that the demands were getting increasingly onerous and they couldn't resist them. They felt things would ease up after the Olympics. They really didn't ease up. They got worse as things went on.

Google also got cited. They had a project called Google Suggest, which is now worldwide: you type something in, and Google completes the query for you. But when they introduced it in China, where it was going to be a big boost because it's more difficult to type Chinese characters, unfortunately sometimes the suggestions were pornography sites. Even though other search engines in China had similar problems, China singled out Google for punishment. They even called in the head of the Google China operation, Kai-Fu Lee, and forced him to watch all the horrible things that came up in Google Suggest. It was known as "spanking Google" when that came out.

So during this period I learned that a group of executives in Mountain View, Google's headquarters, were secretly working to pull back from their effort in China, to say, "We can't do this. We were never comfortable with censorship and we should leave."

Things were already coming to a head at the end of 2009 when Google was attacked by dark-side hackers, who were almost certainly in the employ of or supported in some way by the Chinese government.

Some of Google's intellectual property was stolen. But, even worse, the incursion was used to get into the offshore Gmail accounts of Chinese dissidents, and China was able to read their mail.

Again, this was terrible for Sergey Brin, considering his background. He really led the discussions that took place during the Christmas holiday of 2009 and into early 2010, which led to Google announcing that it would no longer censor in China. That effectively meant pulling the search engine from China, because, as inevitably happened, China would not let Google operate without censorship.

They eventually had to move it offshore to Hong Kong, where they could deliver it without censorship. Now the Chinese government does the censoring itself, and sometimes access from within China to this search engine in Hong Kong is sporadic.

It turned out to be an amazing object lesson. But what was interesting was that when Google finally made the decision, it energized the whole work force. I talked to people at Google who told me that everyone in the company remembers where they were when Google's executives put up that blog item saying that "We're not going to do this anymore."

They didn't tell anyone in China that they were going to do this. It came seemingly out of the blue to the people there. Of course, very few people in the company knew about it beforehand.

But it actually fortified the company's moral stance, which is important, because this "Don't be evil" thing is mocked quite often by people. Google is a very big company. When the "Don't be evil" slogan came up, it was during a meeting, in 2003, if I recall right. People were trying to say, "What are Google's values?" One of the engineers who was invited to this meeting said, "We could just sum it all up by just saying 'Don't be evil.'"

The human relations person who was running that meeting said, "Isn't that sort of negative? I don't know about that."

Another engineer at the meeting (it's always the engineers who are pushing this stuff) started writing on whiteboards all around Google. Google is full of whiteboards, because people can have an idea at any second and want to write it down. He wrote, in his very distinctive script: "Don't be evil, don't be evil, don't be evil." And it caught on.

It became a very useful thing, people tell me, for when Google would think "Should we do this? Should we do that?" someone would say, "Well, wouldn't that be evil?"

Some people think "evil" means don't be like Microsoft. But a lot of people felt it is a binary thing that you can just feel in your gut, like that famous definition of obscenity: you know it when you see it. It's evil or it's not evil; let's not do it if it feels evil.

Now, though, things are so much more complex with Google, and a lot of what seemed black-and-white in the early days has become a little fuzzier. Google is involved in very big issues.

Every time they buy a big company, questions come up. They recently bought a travel company with a lot of data that's used by some of Google's competitors; Bing uses some of this travel information as a backend. People wondered, "Isn't it a little evil for Google to have the information its competitor needs?" Google felt, "No, we know we're not going to do anything bad, so it's okay." To a lot of people that isn't a very good explanation. And things get fuzzier.

But Google still feels that that's a useful metric, so to speak, of what's good and what's evil, even though its communications people rue the day that "Don't be evil" came up, because people always use it as a bludgeon. Everything Google does, people say, "Well, what happened to 'Don't be evil'?"

But what I found in documenting Google during a fascinating time, between 2008 and early 2011 when I was doing the book, was that Google was in transition from a David to a Goliath. It was coming to grips with the fact that it was a big company, even though it tried very hard to think like a small one. Its survival really depends on thinking like a small company, because we know from the history of technology that a company which owns a paradigm, as Google does for the Internet, is almost inevitably going to be toppled by a company that isn't wedded to that paradigm, a company that comes up with the next new thing and does it better.

So you can see, it's gone from Microsoft to Google, and now some people say to Facebook. And where's Google? Google is desperate to think like a small company, to be there when the next thing comes along. That denial of gravity, which is ironic because this is the ultimate data-driven company, really is key to Google's survival.

That's the big challenge for Larry Page, who was the original CEO of Google in its very early days. Then, to bring in "adult" supervision, they brought in Eric Schmidt in 2001. He did a great job for the company for ten years. Now Larry is in charge.

Larry is a very ambitious person. He believes that we live in a time when technology has made it possible to do what most people were brought up to think is impossible. So he comes up with an idea, as he did in graduate school: What if we scan every book ever printed and put it all in an index, so that in seconds you could find out what's in the contents of any book ever written?

I grew up thinking, "That's sort of like a pipe dream; that's pretty hard to do." Larry's impulse was "Let's get in a room, we'll scan a book, we'll time how long it takes, we'll calculate how much it will cost to scan each book, we'll make a deal, and we'll figure out how it can be done. Hey, we can do this! Here's the proof. Let's go and do it."
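Page's back-of-envelope logic can be reproduced in a few lines. Every number below is my own made-up assumption (scan rate, book count, cost), not a figure from Google or the book; the point is only how quickly "impossible" turns into an estimate:

```python
# Back-of-envelope with invented numbers: how long and how much would it
# take to scan "every book ever printed"?

seconds_per_page = 2        # assumed time to scan one page
pages_per_book = 300        # assumed average book length
books = 30_000_000          # rough guess at the world's books
cost_per_hour = 10.0        # assumed labor + machine cost, in dollars

hours = books * pages_per_book * seconds_per_page / 3600
total_cost = hours * cost_per_hour
# Under these assumptions: about 5 million scanner-hours and $50 million.
# Daunting, but a finite, fundable project rather than a pipe dream.
```

Change any assumption and the totals move, but the exercise stays the same: measure, multiply, and decide whether to go do it.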

And before people knew what happened, before the authors and publishers knew what happened, Google had a deal with libraries and started doing it. It ran into some trouble along the way, but Google takes on these moon shots.

Larry is very disappointed when people don't apply enough ambition, particularly Google employees, to taking on big problems because he feels that "We're in a position to solve those problems better now than we ever have in the history of humanity."

I think, with that kind of thinking, Google (I can't say that it will) at least has a shot at being there when that next paradigm comes along.

Thank you very much for listening. I'm delighted to take questions.

Questions and Answers

QUESTION: Hello. I'm Ruth Stevens.

I found it unusual that a company, especially a large company like Google, would open its kimono so widely to a journalist. I'm curious about the terms that they set on your engagement over those two years, whether you were constrained, for example, from writing a story the next morning about some juicy tidbit you had heard in a meeting the day before.

STEVEN LEVY:
There are a couple of reasons why they let me in.

One was that I had been writing about the company from a very early time. People were more accessible in those early days. You could spend much more time, particularly with the founders, who as the years went on pulled away from the press. It got harder and harder to interview them. But there had been an early trust, and they felt from the stuff I had done for Newsweek that I sort of got the company.

But more in terms of Google's self-interest, they understood that the company was being seen as obscure, a black box, and that was not going to stand it in good stead in the period it was going into, when it would be under increased scrutiny by regulators and others.

They felt that "Well, if we let this guy in and he could talk about us, maybe it could answer some questions and people would feel at ease, even if he would inevitably expose a wart or two during the course of the research."

But my ground rules were that pretty much everything was embargoed. I couldn't, if I found something interesting, write about it the next day on my own. So the understanding was that everything was on the record, but not until the book came out.

The one exception to that concerned products. I was following a number of projects that would lead to products that weren't out yet, and if a product wasn't out by the time my book came out, I agreed that I wouldn't be announcing it in the book.

Actually, there was one thing in particular that I couldn't write about for that reason there.

Also, there were a couple of exceptions where Google said to me, "There's something coming out, you know about it, maybe it would be cool if you wrote about it for Wired." I did a couple of things for Wired, one on their Chrome browser, which I knew about early on.

Then there were a couple of stories that I worked on with Google. They felt that, specifically to explain their ad system and their search system, they had to be more open right then. So I did a piece on each of those for Wired.

QUESTION: Susan Gitelson.

This is just fascinating. Could you expand further on what is going on internationally? You gave us the very vivid example of China. But here you mentioned that you went on an international tour. Google is expanding in many places. What are their experiences in other countries, particularly, let's say, for Sergey Brin, with the former Soviet Union, or now we have the Arab countries, the Arab Spring and so forth. What are their experiences in helping people to communicate freely with each other within the country?

You haven't had a chance to tell us how many people are currently employed by Google and the effect this is having in these other countries. For example, I just recently was taken on a tour of Google's New York headquarters. We were told they get breakfast, lunch, and dinner. So there's every encouragement for them to remain in their own social group.

In Silicon Valley, there are all these empires and something similar is going on. To what extent do these brilliant, nerdy engineers interact with other people and become part of normal society?

STEVEN LEVY: Great questions there.

As you say, Google has a policy of free food for all the people there. At the time it was last figured, I was told, it came to about $17 a day per employee. When you think about it, if the meals keep engineers on campus even an hour longer, that's a pretty good deal, because those engineers get paid a lot more than $17 an hour.

Google also feels that it's good for people within the company to interact with each other. So you might sit in one of their cafes. They have 18 cafes just on the main campus, and they have probably three or four in New York now.

They bought that big building in Chelsea, which is the second biggest office building in New York City. They had a couple of billion dollars sitting in cash. It was the one big purchase they made in recent years that didn't need Department of Justice scrutiny, so that was good.

But in terms of the other countries, it's interesting. In every country Google is in, it almost seems there's a unique problem. In Thailand you can't say something about the king. In France you have to watch out about Holocaust denial; that's illegal there. Then, in India there are geographic problems in terms of Google Maps. In Germany, Street View. So every single country has a different kind of problem.

When I traveled with the young managers, they brought in users and watched them interact with Google products and they interviewed them about how things worked to try to get a sense of it.

But there is a problem in terms of getting out. Now, they try to hire a lot of local people.

One reason why Google has engineering centers in a number of places is that people from different countries sometimes don't want to come, or can't come, to Silicon Valley, and Google wants to be able to have them work where they live. So Google has a very international work force. In a sense, these people are in many cases part of the local culture. They expose the people who come from the U.S. and work there to other cultures.

But in terms of the insularity, it's interesting. Even in New York City, where it's much easier to get in and out to eat lunch—where in Mountain View you have to get in the car and go to Palo Alto, for instance, to get something to eat—people are jamming the cafeterias there.

They'll try to push people to learn about where they are in the different countries. Their success depends on it because they have to be sensitive to something in every different country.

QUESTION: Good morning. Philip Schlussel, United Coverage.

Is there any reason for any of us to use a search engine other than Google?

STEVEN LEVY: Yes. Google has a lot of competitors. Some of them have found niches in what are called verticals—specific areas of search where they might do a better job. Google will always say, "Well, we have to beef that up."

Recently Google has beefed up, say, the medical search, because other places have come up with search engines that specifically focus on high-quality medical sites.

By being the dominant search engine, Google becomes the biggest target for people trying to game the search engine, which brings down the quality of its results. So Google is locked in this arms race.

It's very good really that they do have a credible competitor in Bing, which has merged Microsoft search with the Yahoo! search population. They have almost 30 percent now. That not only keeps Google honest, but it makes it harder for the people trying to game search engines to say, "Hey, if we focus just on trying to trick Google, we're going to be okay."

There are specialized travel search engines, for instance, that sometimes give better results than Google. So for certain kinds of things there are ways you might get a better result than on Google.

QUESTION: Carol Spomer. Two questions.

One, it seems that any intelligence agency in the world would love to be able to get their data. How are they dealing with that, where they can be doing good if it's not doing evil?

Number two, in terms of vision, you talk about the Goliath type of situation, that Facebook and others could be encroaching on their business. Who are they perceiving as their main competitors? Where do they see the greatest threats short term and longer term down the road?

STEVEN LEVY: Google has to be vigilant really about protecting the privacy of its users. It does not want to be seen to use personal information as a resource.

Some years ago, the government requested a lot of information about the way people search, not only from Google but from Yahoo!, Microsoft, and AOL, which had a big search operation at that time. Google was the only one which pushed back against the subpoena, which they called a fishing expedition, and managed to get the government's requests whittled down to something less voluminous.

So, whereas Google wants to be a good citizen when it comes to a subpoena, and they certainly cooperate in national security situations, they have to be careful because Google wants you to feel comfortable using it. If you don't, you're not going to use it as much.

It's going in the direction where Google is going to want more involvement in your life. So not just mail—they want you to put all your documents in the Google ecosystem. Google's new computer, the Chromebook, doesn't have a hard drive. All the information really lives in the cloud and, presumably, the bulk of it in Google's data centers. So you have to feel pretty comfortable with Google to enter into that sort of relationship with them.

The second part of your question was about Google's competitors. It is an ambitious company that interprets its mission very broadly.

You wouldn't think that mobile phones are necessarily a way to serve the world's information. But as it turns out, very soon, if not already, more searches will be conducted from these things we carry around in our pockets than from our desktops and laptops, so it's an essential area for Google to be in. In that arena, Apple is certainly a big competitor.

Certainly in search, Microsoft's Bing is a competitor, as are Baidu in China and Yandex in Russia.

But overall, if you were to ask which one rival Google is focused on now, it would have to be Facebook. People share a lot of important information with Facebook. You tell Facebook who you're in contact with, who your friends and contacts are; you tell it what your interests are; you tell it where you're going. This is information that Google would really like to have to improve its searches and to be valuable to you in a number of ways.

But Facebook doesn't share this information. Google is terrified that a kind of little Internet unto itself, a Facebook Internet, is going to grow that Google can't get access to, and that Google's quality is going to erode if it doesn't. Google feels it has to figure out a way to get that kind of information about people. So they have a very big social effort of their own that they're working on.

QUESTION: Matthew Olsen.

I'd like you please to expand on the privacy issue. The Wall Street Journal reported that, among other companies, Droid is reporting your location from your cell phone relatively frequently and that Google is retaining that information, as is Apple and some of the other competitors. Why are they gathering this information and what are they doing with it?

STEVEN LEVY: The explanation for why Google and Apple gather this information is that it's an important aspect of the business model of these phones, particularly for Google, which doesn't charge for its operating system. Neither the handset manufacturers nor the carriers pay Google any money. Google gets its payback from people searching more and the advertising they are going to serve from those phones. So it's important to know where you are in doing that.

Google does ask you in advance, before it turns on location services, something like "Is it okay if we turn on location?" Of course the phones aren't as useful without it. So what are you going to do? Are you going to say "No" and not get the value of the phone? Generally, people say "Yes," and they expect that Google isn't tracking where they are at any given minute.

Google says—and I believe them—that they're not really saying, "Steven Levy is at 64th Street right now. Let's watch for Steven Levy."

One thing they want to do is get better and quicker information about your location. To do this, they have to go beyond GPS, and it turns out a good way to do that is to keep track of WiFi hotspots—where they are. By doing that, they keep track of where you are, and the phone is constantly looking for spots to report where you are. So, in a way, they're asking you to contribute to Google's general knowledge of how to find people and where these hotspots are.

A similar thing is what they want to do with traffic. As you drive along, they want to keep track of where the other phones are. They can figure out the density and report to you, if you're commuting, which routes are going to be more crowded. You give up a little bit of your location privacy for the greater good.

I don't think they do a good job of explaining how that works or what that equation really is. It's a little complicated. But they owe it to people to explain precisely what they are doing at every turn. Google isn't transparent enough about that.

Recently, they were cited for collecting information from these WiFi spots. They didn't just take note of "Oh, there's a WiFi spot here with this name on it, there's a WiFi spot there." They were actually sucking up the information that was being passed around in those WiFi spots. These were ones that weren't protected by a password. But nonetheless, what was Google doing getting that information?

They say, "That was a big mistake. We didn't really mean to take that."

I've asked them any which way, but I have not been able to get an explanation about how they failed to notice that this data was piling up somewhere in there.

Google's instinct is to be quiet about certain things. It really has to be totally transparent when it goes down that path.

There are good reasons why collective knowledge of where people are, if it's anonymized, could be good for all of us—for things like traffic and other uses. But we should all have the opportunity to opt in or opt out of what's going on, and we should know in advance what is being collected.

Google talks the talk on this but doesn't totally walk the walk. I think the same about Apple—they haven't been as forthcoming as they could have been. When you go down this path you've got to be open; you've got to let people know exactly what's going on.

QUESTION: I'm interested in the bad guys and in the hackers. Could you talk a little bit about the people who are hacking into things, and is it getting more sophisticated?

STEVEN LEVY: Yes. It's a huge problem.

I wrote a book called Hackers, but it wasn't about the people you're talking about. They were hacking in a positive sense, creating and finding different uses for the computer. The people I wrote about generally predated this era of breaking in and in some way vandalizing other people's systems or stealing. There are huge crime operations going on. We have a giant problem.

Take the recent Sony break-ins—Sony is a company whose PlayStation Network, with millions of users, has been felled by continued attacks by hackers. They can't stop them.

One of the executives, as part of explaining that they were doing everything they could, said something basically to the effect of—I don't have the exact quote—"Really, don't laugh at us. You're going to be in the same situation too." He was talking to other people in charge of protecting databases and valuable information.

We have seen Citibank get attacked recently. How did they get attacked? People just fiddled with the URL. They just put in different numbers and got information about other accounts.

As it turns out, this stuff is really hard to secure, much harder than it is to break into.
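The flaw Levy describes is the classic pattern security people call an insecure direct object reference: the server trusts an account identifier taken straight from the URL. A minimal hypothetical sketch in Python—the account data, names, and functions here are invented for illustration, not Citibank's actual system:

```python
# Hypothetical sketch of an "insecure direct object reference": a web
# handler that trusts an account number taken straight from the URL.

ACCOUNTS = {
    1001: {"owner": "alice", "balance": 250},
    1002: {"owner": "bob", "balance": 9000},
}

def vulnerable_view(url_account_id: int, logged_in_user: str) -> dict:
    # BUG: never checks that the requested account belongs to the
    # logged-in user, so "fiddling with the URL" exposes any account.
    return ACCOUNTS[url_account_id]

def fixed_view(url_account_id: int, logged_in_user: str) -> dict:
    account = ACCOUNTS[url_account_id]
    # FIX: authorize the request before returning anything.
    if account["owner"] != logged_in_user:
        raise PermissionError("not your account")
    return account
```

The fix is a one-line ownership check, which is exactly why this class of bug is so common: nothing breaks when the check is missing—until someone changes the number in the address bar.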

There's a constant tradeoff between ease of use for people—this technology is so broadly used that the people who offer it to us feel that if they make it hard to use we're going to go somewhere else. But making it easy to use makes it more vulnerable. What we have done is we have gone far along the path to ease of use, past the point where security is being taken care of.

After Google got broken into in China, an interesting thing happened. They went into what was called corporate lockdown. It used to be that when you were talking to someone at Google they'd say, "Hey, let me show you our new product. This is all under NDA [non-disclosure agreement]. You can't write about it for a few days until it's out." They'd pull out their computer, log in, and start typing.

Now they pull out their computer and, before they log into Google, they have to pull out their phone, call in, get a number on the phone, and then type that number into their computer in order to log into the Google system. We're all going to be doing that soon.
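The login flow described here—typing in a short code delivered to your phone—is two-factor authentication with one-time codes. As a rough illustration of how such codes can work (this is the HOTP scheme from RFC 4226, not necessarily Google's internal system), a minimal sketch:

```python
import hashlib
import hmac
import struct

def one_time_code(secret: bytes, counter: int, digits: int = 6) -> str:
    # HOTP-style code (RFC 4226): HMAC a counter with a shared secret,
    # then dynamically truncate the digest to a short decimal code.
    msg = struct.pack(">Q", counter)
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    value = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(value % 10 ** digits).zfill(digits)

# The server and the phone share the secret; both compute the same
# code from the same counter (TOTP swaps the counter for a 30-second
# time window), so a code typed from the phone proves you hold it.
secret = b"shared-secret"
assert one_time_code(secret, 1) == one_time_code(secret, 1)
```

Because each code is derived from a moving counter, a stolen password alone is no longer enough to log in, which is the point of the lockdown Levy describes.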

QUESTION: Don Simmons.

Health care is such a huge part of our economy and the world's economy now. I'm just wondering what you might know about Google's plans for entry into, for example, individual patient recordkeeping, with its privacy issues, or analysis of public health initiatives, or matching symptoms to obscure diagnoses, any of those areas.

STEVEN LEVY: Google has been working for years on a project called Google Health. It has been sort of a bumpy road.

Literally, when I went on this trip in 2008, the leader of the trip, Marissa Mayer, who is a Google executive, had to join the trip a couple days late because she had to replace the person in charge of that project. That was three years ago. They are still trying to figure that one out.

They had a deal with the Cleveland Clinic. It's a very hard problem to tackle from the consumer side. What they wanted to do was let people have access to their own health records and get at the problem that way.

But then, as it turns out, you literally have to go down to the doctor level to get sign-on to that. There are obviously giant security problems. There are HIPAA [Health Insurance Portability and Accountability Act] regulations, which make it particularly difficult to get things done. So that project, with fits and starts, is still in flux.

In other ways, Google has managed to use its strengths to do some good things. They have this flu tracker that uses data from searches all around to actually track the progress of flu around the world. It turns out to be a great way to use data they normally have anyway. So that's something they have been able to do, and they want to do more in that direction.

QUESTION: Howard Lentner.

I wonder if you would comment on a concern that has been expressed recently that Google searches are producing individualized results. That is to say, if two different people put the same terms into the search engine, they get different results, and this contributes to a kind of echo-chamber effect, in which people only talk to like-minded people, and to the erosion of the kind of public discourse that everyone would share.

STEVEN LEVY: Right. There is a recent book about that, The Filter Bubble, that expresses that concern.

So, personalization. It used to be that Google searches would only be personalized if you were signed into Google. When you go to Google, if you use Gmail or things like that, you'll sign in. In that case, Google knows exactly who you are; they know your identity. Google wants to know more and more about who you are as it fights Facebook now. They have expanded their profiles, for instance, to learn that.

The search results do reflect that. First, it was only if people were signed in. Now it's everyone, because Google knows when you visited it before by a cookie that lives in your computer, which you can get rid of.

Also, there's a way that you could just take out the personalization results. It's sort of a geeky way to do it, but you can get rid of part of the thing that comes up in the address bar and see what it would be like if it wasn't personalized.

Depending on what kind of search it is, the results might be significantly different, or not so different, from what you're looking for. It's going to get more complicated as Google gets more into a new product they have called Social Search, which takes a look at what people in your social world are doing—people you might follow on Twitter; they'd love to get the people you follow on Facebook.

As for the people you communicate with on Gmail—they don't share that with other people; they learned their lesson from Buzz. If I follow you on Twitter and you've indicated—the other part of it is that you can indicate you like a search result—then that result would show up higher in my results. They're just starting to test whether that is more effective in terms of pleasing me.

What Google really wants to know is not so much the larger question of whether everyone gets the same search results, but "Are we going to please this user with the results in this particular search?" That's what they're measuring for. So it's sort of an orthogonal issue. If Google can please you by giving personalized results, then that's it for Google—that's the end of that, and the other issue goes by the wayside.

It's good to be aware of the degree to which things are personalized, that if you wanted to get a broader spectrum of results, you could look for it. But if you're seeking that, you're going to click on the results that are interesting to you.

So basically it comes back to the user. If you only want to see a narrow range of results, and those are the things you click on and the things that make you happy, Google is going to give those to you. They're not going to say, "You've been reading the progressive side of an issue, so we'll give you the opposite, the conservative side, which you'll never click on." They'll give you the stuff they think you're going to click on.

JOANNE MYERS: Thank you so much for taking us inside the Google Plex.
