Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live

Jan 30, 2012

Well-known blogger Jeff Jarvis celebrates what he calls the "emerging age of publicness," arguing that anything we have to fear in this new networked world is overwhelmingly outweighed by all the good that will come from it.

Introduction

JOANNE MYERS: Good morning, and welcome to the Carnegie Council. On a day of a web-wide protest over two legislative bills to stop online piracy, which many techies fear could restrict the web's freedom to publish, we are indeed delighted to welcome Jeff Jarvis to our program.

Professor Jarvis is a well-known blogger [http://BuzzMachine.com] who has definite opinions about many subjects, but none more so than an optimistic belief in the possibilities of new media. He will be discussing his somewhat controversial new book, Public Parts: How Sharing in the Digital Age Improves the Way We Work and Live. In it, he argues for openness on the web and asks such critical questions as: What are the benefits and dangers of living a life in which everything is shared? How do we define what is public and what is private?

When the printing press was invented several centuries ago, it was anticipated that everyone would embrace this new invention and, with it, the availability of information. Yet many found this new way of communicating frightening. Today, with tools such as Facebook and Twitter, sharing information has become far easier than in the days of Gutenberg. Even so, just as in those earlier days, many are resisting these innovations, with a major concern being the loss of privacy.

In Public Parts, Professor Jarvis examines what he calls "the emerging age of publicness," a time of epochal change that is bound to affect life as we know it, altering the ways we do business, how we create identity, and how we interact with others. He interviews such well-known luminaries as Facebook's Mark Zuckerberg, Google's Eric Schmidt, and Twitter's Evan Williams, and asks them to shed light on this new industry based on sharing information.

In outlining the relationship that privacy has to publicness, Professor Jarvis lays out the major concerns of those critics who see the Internet as a threat to their private lives. He explains why their worries are overblown. As he distinguishes between privacy—or what he calls "the ethic of knowing"—which applies to the recipient of private information, and publicness—or the ethic of sharing—which applies to the originator of the information, he provides powerful challenges to existing definitions of privacy. In the end, Professor Jarvis sees the advantages and argues that anything we have to fear in this new networked world is overwhelmingly outweighed by all the good that will come from it.

Whether you agree or disagree with the ever-increasing public nature of life online, I believe you'll find Professor Jarvis's argument interesting, raising questions that we should all be thinking about. But before we can discuss his views, we need to know more about his position and how he envisions the future.

To do so, please join me in giving a warm welcome to our guest this morning, Jeff Jarvis.

Remarks

JEFF JARVIS: Good morning. Thank you very much for being here.

It is, in fact, an amazingly auspicious time to be discussing this today. I'm sure that you all went to Wikipedia first thing this morning and saw that it is blacked out, and Google has blacked out its logo. This is a war over SOPA [Stop Online Piracy Act]—we can discuss this in more detail later—a bill to protect the entertainment industry against what they see as a threat of piracy, but in the process, to change the very architecture of the net as what I call a tool of publicness.

That's where the danger is. That's why the geeks around you are all so alarmed and are having fits today. We can talk about that more in a bit, because it does really go to my entire point in doing this: I sense this new age of publicness, this ability to be public, this ability to share that people are taking on, and I want to protect the potential for what that can bring us.

To do that, I kind of had to get through a gauntlet of considering privacy as well. Privacy and publicness are not at odds with each other. It's not binary. It's not either/or. But clearly we have a choice when we know something whether to keep that private or to make that public. These are choices that we make, and they have an impact upon each other. But publicness depends upon privacy. You have private thoughts that you choose to make public. You hear things in public that affect your private thoughts. You have a private self.

I want to emphasize that privacy is very important. I believe in privacy. I have a private life. Privacy needs protection. But the fear I have is that these days we are discussing privacy so maniacally, with such fear, that if we readjust our world around those kinds of controls, we lose the opportunities that come from publicness, from the ability we all have now that each of us carries a Gutenberg press in our pocket.

So privacy: I have found that fear about privacy is often tied to new technologies—the Gutenberg press. When the press came out, various authors at the time did not like the idea of having their ideas set down with their names permanently and spread widely. Jonathan Swift at the time—and I'll mangle the quote—said that a book of verses kept in a drawer and shown to a few friends is like a virgin much sought after, but a book once published is like a common whore anyone can buy for two crowns.

Fast-forward: The first serious discussion of a legal right to privacy in the United States did not come until the year 1890, which amazed me. The lawyers in the room will know all this, but I didn't. That came from a technology. It was born out of the invention of the portable Kodak camera and the fear that suddenly anyone could take your picture anywhere and it would appear in this penny press that was growing.

What did that mean? There were some efforts at legislation at the time to require anyone who took your picture anywhere to get your permission to do so, because this was your privacy. The New York Times of the time had great stories about "fiendish Kodakers lying in wait." President Teddy Roosevelt banned "Kodaking," as it was called, from Washington parks for a while.

This fear of technology is about change. A change comes in, and we have not yet adapted our societal norms to understand what to do. Is it right or wrong to take someone's picture and publish it? Is it right or wrong to pull out your phone and start tweeting in the middle of a conversation? These are norms we're trying to adjust to as society changes.

There are many other technologies along the way—small microphones, small video cameras that are everywhere, RFID [radio-frequency identification] chips that are in your pants at Walmart (though I don't think any of you shop there), and so on. These all lead to fears, because we don't know what the controls are on them yet. We don't know yet what the proper behavior around them is. But that doesn't make them all necessarily bad; neither does it make them all necessarily good. They are what they are. They're technology. They can be used for good and they can be used for bad.

I then tried to figure out the definition of privacy, which I thought would be easy and it turned out to be very, very, very difficult. I kind of thought it was about controlling your information, something like that. But the definitions of privacy are many, many, many. Some have said that it's kind of an empty-vessel word into which people put their concerns. I won't go through all the definitions that I went through, but it was enlightening to me to see the difficulty there is in defining privacy.

Brandeis and Warren, who wrote that 1890 law review article that became the basis of the discussion of a legal right to privacy in the United States, came down to the idea that it was the right to be let alone. They also came down to the idea that it was about feelings, which, said wrongly, could sound like it's putting it down, diminishing it. But it's not.

If you think about what privacy really comes down to, it's, in great measure, our fear of what other people may think or say about us and/or do to us based on that. The insidious thing about it is that we don't necessarily know what they are thinking and they haven't necessarily said it. We are just fearing it. So privacy is about feelings. It is about worrying about that. And it is a legitimate concern.

But, as I said, I didn't want to write a book about privacy. I wanted to write a book about publicness. So having gotten through that gauntlet—some would say not satisfactorily, but I got where I got—I then wanted to look at publicness and the benefits of publicness. I really saw this at first at a personal level.

I have been blogging since 2001. I was at the World Trade Center on the last PATH train in, and a week later, I decided I would blog for a few weeks to kind of get it out of my system. It changed my life, it changed my view of media, it changed my career, to realize that media properly is a conversation. I would write things, people elsewhere would see them, they would write about them, and they would link back to me and I would link to them. That link created conversations that occurred in different places at different times in public.

I realized that media is not so much about making a product; it's about making a conversation, about enabling a conversation. That was a huge revelation to me in my career as a journalist. I teach at a journalism school now.

So I blogged. I blogged a lot. I blogged about my life and my opinions and so on. So I was used to being public. Then a couple of years ago, I got prostate cancer, and so I blogged about that, which might sound insane. It's the most personal of data you could imagine, talking about one's private parts made public. But I found tremendous benefit in this.

When I did, people came forward to me. I had friends who had had the operation and I didn't know they'd had the operation. They came forward. They were able to tell me what to expect in ways that I never would have gotten from a doctor's pamphlet. It inspired other friends of mine to go get tested. No matter what the government says, if you're over 40, get tested.

I just saw this incredible benefit, and nothing bad happened as a result of doing this. One man, who didn't like me and had already blogged about that, wrote a blog post accusing me of "over-sharing." It's really an odd word when you think about it, because what he was really telling me to do was shut up. He was telling me that I shouldn't say these things. But how dare he tell me what I can and cannot say?

The problem here was not that I was over-sharing; the problem, of course, was that he was over-listening. There are so many mechanisms today to cure that. On Twitter, he can "unfollow" me. On Facebook, he can "defriend" me. He need not link to me. He need not click on me. Leave me to do what I want to do. Leave me alone, and leave you to find your world where you want it. I can be public and he can avoid me. And that's fine.

This was an effort to try to control me. I don't like that very much. We geeks don't.

If I'm going to fight for this notion of publicness, then I had to go through and figure out how it matched to privacy, what the benefits were. Let me first go to how it relates to privacy.

You mentioned the kind of conclusion I came to in discussing a definition of privacy. The problem with definitions of privacy is that it often comes down to—Helen Nissenbaum, here at New York University [NYU], wrote a brilliant book about this talking about how it's all about context [Privacy in Context: Technology, Policy, and the Integrity of Social Life]. She's right about that. When someone tells you something, you need to be aware of the context of that. But trying to figure out what that context is, is very hard, and trying to codify that is very, very hard.

That somewhat inspired me, along with a woman named Danah Boyd, who is a brilliant researcher, also now at NYU with Helen and at Microsoft—and Danah does a tremendous amount of work with teenagers and the Internet and understanding what they really go through on the Internet, and is wonderful about that. Danah argues that we should not regulate the gathering of data so much as the use of data, the use of information.

The example she gives is, if I walk into your office applying for a job and you see that I have prematurely gray hair, you can surmise my age approximately. You certainly can see my race and my gender. You can guess at my education and intelligence. You can use some of those factors in hiring me and others you can't use. But you know them.

You can decide, "I'm not going to hire that old fart." And you'll get away with it once. But if you do it two or three times, there's a pattern established, and you know what's going to happen. We'll come after you, because we're regulating in that case the use of the data. I can't make you not know something that you know.

That's the absurdity of some regulation of the Internet, trying to cut off this idea of what the Internet knows: "Google should erase that." In Europe they are talking about regulation of the right to be forgotten. It sounds perfectly reasonable: "You must erase this stuff about me on the Internet."

Would you say that to The New York Times? "Gee, I don't like that story you wrote. I demand that you erase it." No. That impinges upon the free-speech rights of The New York Times. To tell Google that it must not link to this thing or tell my blog that I may not have this thing up, if it's not libelous and it's not illegal—you just don't like it? Welcome to life.

So the use of data is important.

So I realized, inspired by Danah and Helen, that privacy, as I said, is an ethic of knowing someone else's information. If you tell me something, it is now public to that extent. What I do with it—the responsibility lies on my shoulders. I do have to guess the context of that.

Was that a private conversation? Was it something that you wanted secret? Was it something that may be beneficial if I share it with the world? Would you like it if I bragged about this? Those are decisions that I have to make now. If I'm the company and it's your credit card number, I am honor-bound to hold on to it. So there's an ethic there of knowing someone else's information and what you do with that.

Publicness, I think, is an ethic of sharing. If I know something, the decision to share it is mine and should be mine. That's where the line of privacy and publicness comes. But I also should be aware that there might be benefit in sharing this. When I shared my prostate cancer, if I got two men to get tested and it helped them, then that was something beneficial that came from my sharing. No one requires me to share it. No one should require me to share it. But the fact that I did has some benefit.

The fact that since 9/11, I have gotten one heart condition and two cancers—I'm fine—that probably means nothing. But the fact that I share those data points, and if everyone else who was there that day shared those data points, we might find out that there are correlations there that might help people to know that the condition might be coming and they should watch out for it. So there's a generosity to sharing.

I want to talk about the benefits of sharing in this. I think the benefits are many.

First and foremost, sharing enables us to create and improve relationships. Because we are public, we get to meet and link to other people. You can sit in your room all day and no one should force you out of that room. But if you do, you don't know what you're missing. You come here this morning; you sit at a table; you are in public; you meet other people; benefits come from that. It's the same as the Internet.

The Internet is virtual tables. We meet other people and we connect with them and we link with them, and things come out of that. You don't have to do any of that. You don't have to share. You don't have to show your face. But when you do, you can make relationships. Out of those relationships come great things. We are social by nature, and that is what is enabled by this.

Publicness, I think, leads to trust, especially for companies and governments. We can talk about that more later. It leads to, very importantly, collaboration. Because we are working together, we can collaborate. We were talking earlier about science being open and the fact that scientists can now share their information. It's why the Internet was invented, to share their information more fluidly and instantly and find each other and find more information, which leads to an explosion.

Back to Gutenberg. Before the printing press, scholars had to go to the books. After the printing press, the books could go to the scholars. Before the printing press, you didn't have the ability to so easily compare knowledge and information. After the printing press, you could put two books next to each other and see what was new. It led to the flowering of the scientific revolution.

I think we're in a similar mode today. I just read a wonderful book by a friend of mine, David Weinberger—a man I think you should have here—just out, called Too Big to Know. He argues that knowledge becomes different in the network. I'll let him make that argument more eloquently than I can.

So publicness leads to collaboration. There's a company called Local Motors that is trying to build cars publicly, design them publicly. And it's doing that. It has a design called the Rally Fighter, a muscle car that they are making in a microfactory in Arizona. They hold public competitions for the design of the car.

For this car, a 20-year-old won the competition and got $20,000. Then the community designs elements of the car together. The CEO of the company, Jay Rogers, is still in charge of putting out a safe car and a financially viable product. He's still the CEO. But this input from this community is incredibly valuable.

My favorite story about this is that someone designed a taillight lens for the car that the community loved—"Wow, we must have it!" Jay said, "Okay. But I looked into it. I have to tool up to make that part specially. It will add $1,000 to the price of every car." The community said, "Never mind."

They went through a list of other parts, and they settled on a $75 Honda taillight lens. I would never know it's from Honda. It's designed in beautifully.

What amazes me about that is that the community of customers—and there can be such a thing—was making design and economic decisions in collaboration with each other and with the company; given the opportunity, given the tools, and given the respect to do so. That can only happen because the design happened in public.

What does Detroit do? Detroit keeps it literally under a cloak until they pull it off at the last minute and say, "Oh, our new car!" and we all yawn.

Indeed, I talked to Rogers, the CEO, a few weeks ago. He said that now some of the big car companies are coming to him to get help in design, because he has figured out how to harness this creative power of customers in public. It's a very big change.

So publicness leads to collaboration.

Publicness, I think, disarms the notion of a stranger. When you meet each other in public, you start to know somebody else. I have met people around the world, from Iran and Iraq and China, thanks only to the Internet, people I never would have met otherwise.

Publicness, I think, also disarms the stigma. The greatest weapon that gays and lesbians in this country had, I believe, was their publicness. I want to emphasize here that I'm not suggesting that anyone should be dragged out of their closet. But those who chose to go out of the closet and stand in public and say, "We're gay. You got a problem with that?" as we say in New Jersey, forced the bigots back.

They de-stigmatized this issue and said, "What's the big deal? The secret you made us keep all these years shouldn't be a secret, isn't a secret now. So now what?" I think it has led to this flowering of tolerance.

Now, some could say it makes us too tolerant. That's the fight we have in society about norms. A presidential candidate, when asked last week, said, "You can marry two men or you can marry your dog." All right, we're trying to figure out where the line is. But publicness enabled this.

So I see all these incredible benefits of being public. I'll emphasize again that it's not just benefits that come to this; it's also dangers and risks and so on.

In the book I look at this at a personal level—what it brought to me personally, what it brings to people personally, how to share wisely, and what comes of that—and come up with some rules. Let me take a little detour for a moment.

Among my rules about wise sharing is what I call the cabernet rule, which is that one should never blog, tweet, or do anything online after having had too many glasses of a nice cabernet. I violate the rule regularly.

One night I found a loophole in the rule. I'd had some nice glasses of pinot noir. I was watching news about the debate about our supposed debt crisis in Congress at the time, and I just got mad. I came back to my keyboard—and I will censor this for your sake—and I spelled out, "F___ you, Washington. It's our money, it's our economy, and you're messing around with it."

People encouraged me, which is the wrong thing to do. So I kept going on in Twitter. You're in a conversation and you're having fun, like in the bar. I said, "We should make a chant."

Somebody said, "No, you idiot. That's a hashtag."

Does anyone here not know what a hashtag is? Okay, fine. A hashtag is what we use on Twitter. It was something invented by the users of Twitter to say, how do we gather around a topic? You can search for any word in Twitter, the same way you can in Google or Facebook or anywhere else, but the problem was, if I wanted to talk about privacy and I used the word "privacy" in some other context, I would get junk back. But if you want to have a discussion explicitly about privacy, you put a hashtag, the gate-mark symbol, in front of it. Those are people who explicitly decided to talk about that topic. You search on that and you get the discussion.

When I said, "Let's have a chant on Twitter," somebody said, "No, you idiot. It's a hashtag."

So I put "#F___ you, Washington."

I hope this doesn't go on my tombstone as my proudest moment in my career.

But it exploded in a way that I could not have imagined. Some people got mad at me and said, "You should have said your nastiness to this member of Congress or to the GOP or to this person or that person."

"Jarvis, you're playing into the generic anger against government with the Tea Party." Maybe.

But what happened was that people filled in the sentence. It became an empty vessel in which they said what mattered to them.

"F___ you, Washington, for making my parents nervous about whether they can pay their bills next month."

"For not letting me marry who I want to."

"For not being able to negotiate any better than a 3-year-old."

A hundred-and-ten thousand hashtag tweets later, I learned a lesson. This was an empty vessel that enabled people to organize around an idea.

A few weeks later comes, for me, the substantiation of that lesson, which is Occupy Wall Street. Nobody knows what it means. There was no hierarchy. There was no organization. There was no structure. There was no spokesman. It was an idea. People gathered around that idea. That's what this tool of publicness allows—Arab Spring, the Indignados of Spain, the fact that in Iceland they are rewriting their constitution on Facebook right now.

Yes, they are fun geeky tools. And, no, it was not a Twitter revolution in Egypt. It was a revolution of brave individuals, in Egypt and Tunisia and Syria. But these tools help people organize. The confounding thing about Occupy Wall Street is that it doesn't have an organization. It doesn't have a credo. It doesn't have a message. They are figuring out what that is as they go, just the same way we in society are figuring out our new norms around things like privacy and publicness and what's right and wrong to do in public.

That's a key benefit of how these tools let us work as individuals and join together in groups.

As far as companies go, I think they need to learn this way, because there's no choice. Companies are, in a sense, public organizations. They deal with the public. They can't hide anymore. You can't sit back and think that you can do things in your private room and never be affected. The conversation is going on about you anyway everywhere, and if you're not part of that conversation and if you're not open and transparent, you're screwed.

I wrote about this in my last book, What Would Google Do?, out in paperback now: a kerfuffle that I had with Dell Computers. I had problems with my computer, so I went on my blog and complained about it. They ignored it for a long time, and then they finally turned around and really learned that they had to deal with these bloggers online, and they thought it was actually beneficial. So they went and found bloggers with problems on the web and solved their problems. The bloggers then turned to the Internet and said, "Wow, you won't believe it. They came and fixed my problem."

There's a guy named Frank Eliason, who was at Comcast, and he had the job of dealing with customer complaints for the cable company—not a great job. I just read his book that he just wrote about this [At Your Service: How to Attract New Customers, Increase Sales, and Grow Your Business Using Simple Customer Service Techniques]. He started a Twitter handle called "Comcast Cares." He answered people's questions. He answered their problems. It changes the view of a company, the relationship it has with the public. Companies can do this.

Government: Clearly government should be far more public. Government must become public by default and secret by necessity. It's the opposite today. Government is secret by default and public by force, whether that force is a journalist or a reporter digging, whether that force is a leaker or WikiLeaks, whether that force is a whistleblower inside government.

Right now government just presumes that we don't want to know this. They make it hard for us to know things, not always on purpose. There wasn't an Internet before. But try to find legislation in process in Washington or certainly in state government. It's very, very difficult. Try to find out what regulations are going on in government. It's very difficult. We need to make government far more transparent.

There are some efforts. The Obama administration promised to be more public. A guy named Micah Sifry wrote a very good book about publicness [WikiLeaks and the Age of Transparency]. He was a supporter of Obama, but he's very critical of the failure to reach many significant goals in publicness, in transparency, in this administration. There are steps forward, but it's very difficult.

Part of the problem here is that we have to change our relationship with government. The issue for government, I think, is that they have no license to fail. They can't try things. Google can put out a beta product and say, "This isn't finished. This isn't done. See what you think of it. If it works, great. If it doesn't, we'll fix it." If government does that, politically it's dangerous. It's up to us, then, as citizens, to demand experimentation and innovation and trying things—and, yes, failing at things. It's very, very hard for government to do that.

I write about an effort called Peer to Patent. Beth Noveck, who was Obama's first transparency czar, wrote about this effort to fix our very broken patent system by bringing experts in to help out with the process. I thought that patent examiners would hate that, because they must be bureaucrats and they must be protective of their turf.

Beth said, no, it's the opposite. They are very lonely. They are harried. They are overworked. They don't have all the tools and expertise they need. If people would come in and help them, it would actually improve their job. The Patent Office is taking this over.

That's an example where we are using transparency in government, not to go get the bastards—and there are still bastards to get, with red hands—but, instead, it's a way to collaborate. If we make the work of government public, for the public, then we have this benefit.

Indeed, it affects our very definition of what a public is. This is a hard thing to get one's head around—to get my head around. What is a public? We're re-forming that now. Is a public a nation? Look at the Arab Spring. It cut across nations. Look at Occupy Wall Street. It, too, cut across nations—publics formed around interests that cut across this idea of what a public is. I think we're going through a transformation that is perhaps just as big as that from Gutenberg.

Let me talk for a moment about two last things, and then we'll start discussion.

One is the notion from a bunch of academics at the University of Southern Denmark that they call the Gutenberg Parenthesis. They argue that we went through a big change. Before Gutenberg, knowledge was passed around scribe to scribe and mouth to mouth. It was more process than product. It was changed along the way, as scribes miswrote things or things were done word of mouth. It had very little sense of ownership or authorship. Look back to Jonathan Swift. Knowledge was then really about trying to find and preserve the knowledge of the ancients, who were revered over current knowledge.

Then comes Gutenberg and, as McLuhan would say, our life changes in fundamental ways. Knowledge became—our view of the world, as well—serial, became linear. McLuhan says that the line—the sentence is an example—is the operating principle that we have here. It's linear. It has a beginning and it has an end. It is owned. It is authored. It is a product more than a process. It reveres current knowledge, current scholars, current experts. That's the Gutenberg Parenthesis, 500-some years, and it was a wonderful thing.

But they argue that we are now coming to the other end of the parenthesis into a time when knowledge is once again passed around, click to click, link to link, changed along the way, added to along the way. It changes our sense of knowledge. There isn't a beginning and an end. Look at news online today. It's not all packaged in a neat little story in a box. It keeps going on and going on.

There is less of a sense of ownership and authorship. Witness the fight we're having today about SOPA and the effort to protect the idea of ownership of knowledge.

By the way, I'm not against copyright. But there's proper use of it and improper use of it.

What these academics argue is that when we went into the Gutenberg Parenthesis, it was very hard for people to get their heads around this. It changed our cognition of our world. Where we come out on the other end, we're going through the same thing now. We're changing our notions of how our world operates, and our norms and our ethics and our behavior and the structure of society and the structure of institutions and the death of institutions and the birth of new institutions.

So we have this ability, I think, today to create publics. That's what this is all about with the Internet. We can create publics. We can all be public and create publics. That's an incredible power.

Habermas argues that the public sphere emerged in the 18th century as the rational, critical debate in coffee houses and salons became the first counterweight to the power of government.

There's a wonderful group of academics at McGill University in Montreal, which brought together academics from elsewhere to look at the notion of creating publics. They argue that, no, we created publics before that, that the printing press and the public stage and art and markets and printed sermons and printed songs all enabled the creation of publics.

Paul Yachnin, the scholar at McGill, who is a Shakespeare scholar, said that when 3,000 people gathered in the Globe Theater to watch Richard III, they were gathering around the idea of what you do when you have an incompetent ruler. They were gathered as strangers around a public idea. Shakespeare had the ability to create a public around an idea.

I think that's the power that we have today, that we all have now. We all have this ability, not just to produce content, but to create publics. A hashtag can make a public. That's an incredible power in the hands of the people.

I think we're very early in this. The assumption that I think we all make is that this change is volcanic and lightning-speed, to mix my metaphors, that it's really fast. But I have come to think lately that, no, perhaps this change we're undergoing now is actually very, very slow. Elizabeth Eisenstein, who is the key scholar on Gutenberg, argues that the book did not take on its own form as the book until 50 years after Gutenberg.

Books were originally produced with scribal fonts, meant to mimic the handwriting of scribes. In fact, publishing was called automated writing. It's like horseless carriages. We always see the future in the terms of the past, to interpret it, until we rethink it, until a printed book becomes a printed book. It took 100 years, she says, before the impact of the book on society was felt, was understood.

I think we are operating in the Gutenberg years. John Naughton, a columnist at The Observer in London, asks us to imagine that we are pollsters on a bridge in Mainz in the year 1472. I'm going to Mainz on Friday. I'm going to the Gutenberg Museum, because of this fascination I now have.

Imagine standing on that bridge and asking people of the time, "Do you think this invention that Gutenberg created"—about as far in the past for that pollster as the web is for us today—"do you think it will tear apart the authority of the Catholic Church, fuel the Reformation, fuel something we'll call the scientific revolution, change our notion of nations and societies, change education, and thus our very definition of childhood?"

"Nah. Not that big a deal."

That's what we're going through, I think, today with the Internet. The Internet is an incredibly powerful tool. We don't know what all it can do. So to try to restrict it and regulate it today down to the terms of the past would be a gigantic mistake.

It is disruptive, and disruption is disliked by institutions and the powerful. That's life.

So we have to protect that tool. Who will protect our tool, finally? Is it companies? I'm a Google fan. I wrote a book about Google. But Google has to be a company. Even though it does protect our interests a lot, today with SOPA and the blackout, it also does what I think are devil's deals with the phone company about Net neutrality and such. It's a company. It can't protect us.

Will government protect us? I went to the eG8 in Paris. Sarkozy was talking about civilizing the Net, and I had the temerity to stand up and ask him to take a Hippocratic oath for the net—first, do no harm—which he dismissed.

This is the point of SOPA. If we put the power in the hands of government and companies to kill a site, to blacklist a site because of alleged piracy, then those same tools can be used by any government to kill speech. That's the issue. There's a principle at work.

So at the end of the book, I come around and I argue that we must have a discussion about principles, that government cannot protect the Internet and its openness because it is not in its interest to do so. The same with companies. We, the people of the net, must. The SOPA fight is the first time I'm seeing the Net, as a public, rise up to a principle. I think it's very, very, very important.

So I propose some principles at the end. I will say right off the bat that they are the wrong ones. But I just want to have a discussion about them, because that's what we're really doing, building a new world around principles.

The ones I propose:

  • We have a right to connect—not that government should pay for your connection to the Internet, but if government cuts off your connection, I think that's a violation of your human rights, Mr. Mubarak.

  • We have a right to speak, our First Amendment.

  • We have a right to act and assemble on that basis.

  • Privacy, as I say, is an ethic of knowing someone else's information. Publicness is an ethic of sharing your own information.

  • Government's and institutions' information—not our individual information, but government's and institutions', including companies' information—should become public by default, private by necessity. And there are necessities, again.

  • What's public is a public good. I didn't go into all this here. When Germany pressures Google into blurring photos in Google Street View because people don't like this—pictures that were taken from a public street of a public view—then I say what they have reduced is the public good, the public sphere. If you can tell Google not to, can you tell journalists not to? Can you tell citizens not to? It's an issue of principle. It sounds good, but it doesn't work well in a free society.

Finally, two last principles:

  • All bits are created equal. If a bit on the net can't travel from here to there—it's restricted, it's slowed down, it's detoured, whether that's by a company or a government—if one bit is restricted, then no bits can be presumed to be free. That's the issue today.

  • Finally, the architecture of the net must remain open and distributed. The net itself defies anyone claiming sovereignty over it, whether you are China or France or the United States or Time Warner. That's what makes the Internet the Internet, and it has to stay that way.

So those are the principles that I propose. That's what I have to say about publicness. I'm eager to have a conversation in public about that.

Questions and Answers

QUESTION: Anthony Faillace. I'm a Carnegie trustee.

I'm going to throw out three different things and just let you run with them, as to objections to this new age.

One is that people on the net take legitimate news organizations' work, make it their own, don't pay for it, and thus reduce the incentive to do actual research, reporting, et cetera.

Number two, you said if I want to stay in my room, I don't have to be commented on, but obviously there's a different line. Maybe you could comment on where the lines of privacy are supposed to be drawn. There are all sorts of issues around facial recognition software. I went to a psychiatrist's office, a strip club, had a gay affair—whatever it was. Is that fodder for—

JEFF JARVIS: Brave of you to admit all that right here.

QUESTIONER: Is that fodder for the public?

Finally, there's a critique that says that a lot of it is just a superficial waste of time, that people are out there clicking away on 140 characters, that you don't really learn anything from that and it's taking away from people actually doing more substantial reading and coming to broader conclusions or deeper research than they would in this sort of world of a few clicks.

JEFF JARVIS: News, line of privacy, and triviality.

News: I'm in the news business. I teach entrepreneurial journalism at CUNY [The City University of New York]. I'm all about trying to make news sustainable. But I'm not sure that that is going to be by selling content anymore. It was possible in a world of scarcity, where you owned the only printing press and content and space were scarce.

The Internet is all about abundant knowledge. Indeed, this idea of knowledge, of information—you can copyright the treatment of information, but you can't copyright the information itself. Once it's known by people, it's known by them, and they can do what they want with it. They can spread it around. In fact, that's what makes society, society. That's the good thing.

So I don't think there are organizations, as a rule, that are taking the news and stealing it. They are often linking to it. They are sending people and audiences to that news, if they do it right. The problem is that if the recipient of that link doesn't know what to do with it, that's his or her problem, Mr. Murdoch.

Let's skip to the last one. I think this notion of the Internet as trivial is kind of an institutional worldview that says, "But you still need us. You need us as publishers, as curators, as gatekeepers, as editors to prepare the world for you." Not so much.

Is there value there? Absolutely. I would be a hypocrite if I were training journalists and we didn't need journalists. We still need journalists to do things, like ask the questions that aren't getting answered in the flow of information, like adding context, adding fact checking, debunking—all these things that are values that journalists can do. But rewriting the same AP [Associated Press] story that everybody else has rewritten already—not so much. We don't really need that.

So I think in journalism we have to look at the core of what value we add and ignore the rest, which was there by the necessity of the means of production before.

The line of privacy: that's up to every individual to decide, what's private. But we also have lines as a society to figure out. Facial recognition—a good example. In Germany the head of consumer protection has decreed that tying facial recognition software with geographic software is henceforth taboo. You can understand why she is saying that—"Oh, my god, it's creepy, it's wrong. What could happen?"

Turn it around. Imagine how you could use that software to find missing children. Imagine how you could use it to find people who are missing after an earthquake or a tsunami. Imagine how you could use it to find, yes—in fact, they are using it this way in the UK right now—to find terrorists. Better than body searches.

There are good and bad uses of this technology. If we try to cut off the use today because of a presumed or feared use, then we cut off the benefit that can also come. That's A.

B, we make the technology the object of our fears, when, in fact, it's our behavior as a society that we should regulate. It's the use, it's the behavior we should regulate. The technology can be used for good and bad. If you restrict the technology too much, you restrict both the good and the bad.

QUESTION: Edith Everett.

Also on the issue of privacy, you may recall that when the Patriot Act became law, it was okay for the FBI to go to the library and find out what books you were reading. It took a few very brave librarians to say, "We're not going to keep records of anybody who comes here if you're going to use it that way." That was the end of that.

Fast-forward: My grandson said, "Grandma, you have to have Netflix."

So the other night I tried it. I watched a program. The next day I get an online iPad message: "I hope you liked what you saw last night."

Oh, my god, they're watching what I'm watching? It's upsetting. That's one thing.

JEFF JARVIS: It depends on what you watched.

QUESTIONER: I'll be careful. That's the point. It's limiting.

The other thing is, on a different issue, you spoke about relationships. I'm wondering if they are not a mile wide and an inch deep. When you have 1,000 people that you're in touch with, I don't know how close you can get to half a dozen.

JEFF JARVIS: All right, the first question. Companies should be very open and transparent—this is part of the ethic of knowing—about what they record about you, why they record it, how they use it, what benefit you have. They should give you control whenever possible about how it's used.

But I would argue that that's a great benefit. Netflix makes its whole business on recommendations. Netflix's business is—video stores are now gone, but when they were there, Netflix argued, "We're better than a video store, because we know what you like and we know what people who like what you like, like, and so we can make recommendations to you." That's, in fact, what their whole business is.

Amazon: Amazon is very valuable because it remembers what I buy and it says, "You might also like this." Oftentimes it's right. I find that to be a service, if the company operates in a very aboveboard way. On Amazon, I can go in and say—I bought my wife a piano book once. I can't play the piano. I said, "Stop using that." So I could control the information and what happened with it. But there's a benefit to that.

About relationships: Yes, the argument is made that there is a Dunbar number that says we can have a maximum of 150 friends that we know well. But I think that's defining friend in just one way. I have incredible relationships with people online that I wouldn't necessarily have over for dinner. They don't all have to be the same. But I can meet someone online who I know knows their stuff about something and has an expertise or a passion in it.

I'm on Google+ right now and there are a bunch of photographers there. Trey Ratcliff is this wonderful guy who has been very generous with his advice. I'm not a photographer. My daughter is trying to become one. So I can see the craze there. Trey and I have been back and forth online. We have had wonderful conversations online. We'll never have dinner together. And it's okay. It's a new kind of relationship that is made possible.

There's a woman in the UK, whose name I'm suddenly forgetting, who talks about this as an idea of ambient intimacy. When we talk about our breakfast, that's silly. But on the other hand, if you talk about having gotten your hair done yesterday, when you next see your friend who reads that, the friend already knows that you had your hair done yesterday, and you can get the conversation past that into something deeper more quickly.

These are just new ways for us to interact. They are different, yes. There goes the letter. My grandmother would be appalled that the notion of the letter and the thank-you note is dead. (It's not dead in my house. My wife still makes my kids do it.) But the notion of the letter is going to disappear—gone. That seems so odd because it's so much the basis of our etiquette and how we operate. Well, we find new ways. We have new relationships. Technology can enable that.

QUESTION: Allen Young.

You talk about the need for transparency by government. During the First World War, Woodrow Wilson talked about the need to have open agreements openly arrived at. The first experiment was the Versailles peace agreement, which turned out to be a disaster. Since that time, we have discovered that diplomacy requires a lack of transparency—most notably, for example, in the Cuban Missile Crisis.

Where do you think the government needs to be transparent and where should the government not be transparent?

JEFF JARVIS: I say that government should be open by default and secret by necessity. There are many necessary secrets, the first of which is your citizens' privacy, by the way, security, some matters of diplomacy, and so on.

But we look at what WikiLeaks did and what we see are a few things. One is that government made too much secret. Is it right to just open up everything? No. In fact, The Guardian and The New York Times didn't do that. They selected a certain amount that they believed we should know and did not reveal other things that they believed could be dangerous.

Let's keep in mind that out of the WikiLeaks leak of the cable documents came information that confirmed for Tunisians what they knew about their government, which helped lead to the revolution. The openness of that information was important in that way.

Of course, this affects how diplomacy can be handled on an ongoing basis. Secretary Clinton is saying some wonderful things about the Internet as an open tool, but she also attacks WikiLeaks—I think without the sense of irony involved in Tunisia, for example.

But my point here, I guess, in the end is just this: what this incident does is, it puts institutions on warning that anything could be open, so behave well. Anything could be open. You do need to keep secrets. If you keep the secrets that matter, then people will defend you. People got mad when certain WikiLeaks releases put out things that could endanger people. We do know the limits as a society.

Government, I think, lost its credibility to decide what was secret. It has to regain that credibility. It's doing the opposite. It's trying to shut down more, because that's the reflex here. But I do think that we have the opportunity to rethink what secrecy is. It still needs to exist, but not in the thoroughgoing way that it exists now.

QUESTION: Susan Gitelson.

One of the new frontiers on the question of sharing is in academic publishing. There are suggestions that it's no longer necessary to have peer review. Where is this going to lead us?

Also, on the international significance of the Arab Spring and other revolutions, what happens when governments watch very carefully who has been protesting in Tunisia or, more importantly, Egypt now, and also in Russia, where people are openly criticizing the government? What protection do individuals have once they have published their views and the government may come after them?

JEFF JARVIS: You all ask not only challenging questions, but questions that come from here and here, and I have to remember them both.

The academics: There's a bill that is about to be presented in Congress that would put scientific research that is funded by the government behind paywalls in academic journals. So there's a SOPA-like furor that is starting over this. It's our research; we paid for it.

There are those in science who argue that this old system that makes it incredibly expensive—the journal adds some value in terms of peer review and so on, but the journal doesn't necessarily add so much value that—a lot of scientists argue that they should be sharing their information. Again, indeed, that's why the Internet was made, so that scientists could share information more fluidly.

We find new ways to add the value. The peer-review system—well, in part, that peer-review system is now the world as the peer. The world can now review information when it is public. Scientific journals pay for themselves, not for the scientific work so much.

So I think that an ethic of publicness in research is important. When I do research at my university, and I work with some companies, I say, "If you want to work with me on this, that's great, but it has to be public. That's the rule. It has to be public, because it's a benefit to others, and I'm not going to benefit just you."

So I think there is an academic ethic there that the academy has to grapple with about why it exists in this way. That's a whole other discussion.

The second one—

QUESTIONER: Government—

JEFF JARVIS: Right, thank you. Absolutely important. Let's be, once more, very clear. These tools can be used for good or bad.

The example that I like here is that in Iran the government used crowdsourcing of photos to identify protesters so that they could bring them to their version of justice. Yet in Egypt demonstrators found photos of the security forces, put them online, Flickr took them down, and Anonymous put them back up so that they could be identified and brought to justice. Two different worldviews, two different sides.

This is one of the reasons why anonymity matters. Anonymity and pseudonymity matter on the Internet. I think that generally Facebook's key revelation, insight, was that real relationships and real people improve the quality of our connections online. But there is a role for anonymity. Some say that it's only a role used by people making snarky comments online. It's used for that, yes. But it's also clearly used for people who are vulnerable in society, protesters, whistleblowers. And the architecture of the net is built around the ability to be anonymous and not to be tracked. I think that is a critical thing to maintain.

So you see in places like China now that you are at the point where if you go to a café, you have to present identification. You know exactly what that's used for. There are some arguments that say we should require real identity to comment on anything in the United States. What's the principle there? What's the precedent we're setting? What's the tool we're creating that can be used by the bad guys, too?

Anonymity can be a pain. It can be the cloak of cowards in a discussion. But anonymity can be lifesaving to someone who is doing important work in an open society.

So the best tools that exist come from the geeks, who constantly try—far smarter people than I am—to create new tools and new ways to get around that kind of surveillance. That's what I think is necessary to support this kind of openness in society.

QUESTION: John Richardson.

In the world that I have grown up in, there are certain controls or points of reference that are reliable, like a dictionary or the numeric system. So if a politician says two and two equals five, you know he's an idiot—or uses a word that doesn't exist or uses the wrong word. Those things are wonderful.

On the Internet what worries me is, what are the controls going to be? Is it Wikipedia? There is an enormous amount of factual information that is an on/off switch. It's factual. It's science and technology. There are areas of speculation. There are all the unknowns. But what is purely factual and has only one answer—where is that going to be on the Internet? Do I have to have the Viagra salesman bring that to me?

JEFF JARVIS: If you've had my operation, you might need it.

QUESTIONER: But I mean the commercial side of it is worrisome, because it seems to me that it goes to the trust. What's going to be the dictionary on the Internet? What's going to be the base-10 number system that we have?

JEFF JARVIS: The first answer is no, because the Internet is not a medium. Again, we judge our future in terms of our past. We in media look at the Internet, god-like, in our own image and think that it's a medium. It's not. It's a street corner. It's Times Square. It's people talking. It's the public. So it has all the benefits and all the downsides of that. It is the world.

So we can't look at the Internet and expect it as a whole to be cleaned up and packaged and polished like a publisher's list or The New York Times. It's not going to happen. So within that, we then find systems to try to better find information and better find things.

Wikipedia is a pretty amazing thing. Does it have mistakes in it? Sure. Britannica does, too. But it's a self-correcting mechanism that is really quite a wonder to watch. Sometimes that works instantly. Sometimes it works more slowly. But it's pretty amazing. There are smart people out there who care about right answers and who then argue about it.

David Weinberger, in his book Too Big to Know, says that what we're going to find is that knowledge doesn't become so much a single corpus of the knowable world—which we believed we could have before because libraries could only hold so much. Our shelves were small. Our knowledge was never small, but our shelves were, he says. What he says we have instead is the same world that we will always disagree about. So we come up with new systems and new norms and new means to try to figure that out and negotiate that and try to get to the facts.

Will we eradicate mistakes? No. Will we eradicate bozos? No. But will we come up with better means to judge who's smart and who's a bozo? Yes, that's what we are trying to do.

So you look at a company like Google. The reason they are doing a social network now is to try to get signals as to who is considered an authority on a topic, signals so that they can make better search results rise up if enough people think that person is really smart about that topic. Is that foolproof? Absolutely not. Fools can rise, too. Witness primetime TV and the reality shows. Those people rise up on top. So nothing is fixed there.

But I tell my journalism students that when you see this kind of problem, find the opportunity in it. Find the need and opportunity in it. If you see that there is too much news and too much information, then there's an opportunity to become a curator, to find the best stuff, the people who really know what they are talking about, and to become that. We will gravitate, I think, to those kinds of sources and those kinds of new forms. It's a new dictionary.

Here's the other beautiful thing about Google. When they say that they want people who think differently—if you have a classic solution to the problem of misspelling online, what would it be? What's the obvious answer? What do you still use? A dictionary, right? Not Google. Google, instead, looks at all our misspellings and where we end up.

It leads to that magnificent, wonderful phrase, "Did you mean?" Google did that algorithmically, by listening to us and seeing our behavior. If we all misspell a word and make it acceptable—when I worked at the Chicago Tribune years ago, they had simplified spelling. "Though" was "tho." That's what I had to type. But you knew what it meant and it got there. That's the way Google operates.

QUESTION: Peter Russell.

You have been using the term "publicness."

JEFF JARVIS: Not in your dictionary.

QUESTIONER: In my traditional mindset, I think of "openness" as a very inclusive, wide term. I would like to ask you to define, in your mind, the difference.

Also, one thing that strikes me through all of your very thoughtful presentation, particularly from the public side, is that it seems to me that a lot of what we are trying to get at is the quality of decision making. That relies on good information. Do you think we're getting there? I'm thinking particularly of the public institutions' governance, as you talked about.

JEFF JARVIS: It must be a tradition here to ask about two different topics.

We'll find out. We don't know yet. I'm an optimist, to a fault. I tend to look at what's happening that I think is good. I see that we have choices to make, and I want us to make the best choices. But I think we are a more fact-based society, in some ways, than we were before, in the sense, as we were saying earlier, before I got up, that when you had a barroom disagreement in the past, to settle it, you had to go to the library. And you didn't bother; you had another drink and you argued about it.

Now you pull out your phone and you find out a fact. You expect it to be there within 0.3 of a second, and it is. It may not be the right fact. It may have risen up. But we work out systems to figure out what that is. Is there the opportunity for more facts? Yes. That's A.

B: Are the facts, in fact, being used in the process of discussion? Well, not so much, as we hear in political debate. But I think there is the opportunity for us to fact-check.

I held a session at CUNY a few weeks ago on fact checking. There are a lot of new efforts to fact-check what's said in political debates and political discussion. There are organizations like PolitiFact and FactCheck.org, as well as "by god, journalists should be doing this."

The New York Times ombudsman had a rather amazing column a week ago asking whether they should be truth vigilantes. Yeah, I think that's your job. When they heard something that was off, should they challenge it? I think so. People skewered the ombudsman for even asking it, saying, "Of course, that's what you do."

Are we succeeding at it? No. But are there a lot of efforts? Yes. So I have hope in that.

To publicness: "Publicness," I will admit, is an awful made-up word. It's not fully made-up. It is in some dictionaries. The proper word would be "publicity," but it's freighted with all this marketing meaning now, with messaging and all that, so I did not want to use that word. So I had the hubris to print my own dictionary listing and define publicness.

Here's how I defined it: "The act or condition of sharing information, thoughts, or actions; gathering people or gathering people around ideas, causes, needs—that is, making a public; opening a process so as to make it collaborative; an ethic of openness."

That's how I define publicness as I use it in the book.

JOANNE MYERS: Thank you for a very interesting discussion.

JEFF JARVIS: Thank you so much.
