
The Internet is Sentient

Prologue

There is a battle going on between the Internet and those who are threatened by the Internet's values. This includes anyone protecting old business models: incumbent telcos and cablecos, and rights collectives representing old distribution models for intellectual property. Most importantly, though, the greatest threat is to the power of the nation-state itself.

This struggle is marked by the Internet achieving sentience. I use sentience here to describe the deep symbiosis of the network and the people who use it, and the wealth of emergent properties that result. The whole is vastly greater than the sum of the parts.

Introduction/tl;dr

The Internet has achieved sentience. It is a different version of Kurzweil's singularity, but it exists today. The Internet is self-improving, it propagates its unique values and, most importantly, it has now matured to the point where it is able to protect itself from any threat.

This sentience is a fusion of network and people. Recognizing its existence is central to understanding how society will evolve with and through it, and to making this symbiosis as healthy as it can be. For many, this sentience is understood implicitly. It will be better if made explicit.

It constantly improves

From its inception the Internet has been on a path of continual self-improvement. We can see this in everything from the way content delivery became more efficient (compression, local caches, CDNs) to the way programming languages have become more productive (C++ to Java to Perl to Python to Ruby to JavaScript) to the way social software has become more efficient (the progression from ICQ to Facebook/Twitter and beyond).

Every element of the Internet eventually gets improved upon with huge network effects, and those network effects are just accelerating. Not only is the Internet improving with the number of people connected, it is improving the very people connected to it.

It propagates its values

The Internet has often been described as valueless. It is not. In fact, strong values have emerged. The Internet stands for openness, connectedness, sharing and fairness. It facilitates these things and propagates them. Think of the increasing difficulty that corruption and unfairness now face, from dictators to unfriendly business practices.

The Arab Spring and Verizon removing an unfair $2 surcharge are equally good examples. A groundswell starts that does not come from any one voice, but seems to come from the collective consciousness. It seems to come from everywhere at the same time.

It responds to threats

The Internet itself now clearly responds to things that threaten it. The response is multi-faceted, coming from many different places and in many different ways.

We see responses to legislative threats like SOPA and ACTA where people all over the world have responded to specific national legislation in a way never before seen.

We see this when we see Anonymous and others attacking various parties and entities that threaten the Internet and its values, almost like mutated white blood cells in the Internet's bloodstream. And of course there is no specific person or thing that is Anonymous. Watching those threatened take a traditional approach to stopping it borders on farce.

We see this when we see plans arise for alternatives to the present Internet in response to threats from those who fear it. As the SOPA debate kicked into high gear, a Reddit discussion on "Plan B" began and has not slowed down despite the immediate threat passing. This was not even close to the only dialogue of its kind. The collective consciousness of the Internet understands the threat as continual and existential and in no way limited to SOPA, PIPA or ACTA. Those threatened, on the other hand, think they have a "PR problem" and that somehow Google and Silicon Valley have fooled people.

Finally, the Internet is responding to threats to its existence at an increasing rate. Draw a line through Citizen Lab, EFF and others in the late '90s, Tor and others in the late '00s, and the responses that we have seen in the last months. This trend will accelerate.

What are the implications of sentience?

When we interact with the Internet, whether in platform design, marketing or our own personal use of social media, we need to keep the concept of sentience in mind. It can change the way we approach these tasks.

When we interact with the Internet we need not only to expect it to change and evolve but also to actively think about our own participation in that evolution. We are each a tiny part of the sentience of the Internet.

Perhaps most importantly for this post and for the broad theme of conflict between value systems, any attempt to legislate, regulate or otherwise control the Internet ignores sentience at its peril.

Conclusion

The genie of the Internet is out of the bottle. From now on, its sentience will protect its independence, whether one likes it or not. If those threatened try and squeeze too tightly the Internet will simply get up and walk away. The nation-state can no longer control the Internet but it can do significant damage to it, and in doing so to itself.

The hope is that acknowledging and discussing the Internet as sentient will allow us to approach its relationship with the nation-state, and other interests who feel threatened by it, in a way that more smoothly eases us into the future.

Why We Don’t Like SOPA

The proposed SOPA (and the equally odious "Protect IP Act") legislation is fundamentally flawed in how it works and in the damage it is likely to do to the Internet, which has been the greatest platform for innovation the world has ever seen. For that reason we will be joining the blackout organized by our friends at Reddit by blacking out the Tucows Software Download site on January 18th from 8am to 8pm EST (1300-0100 UTC).

The Internet is a global creature. A "Made in the USA" solution will no more stop the problems complained of than would one made in any other single nation-state. Worse, the US has been at the forefront of ensuring that the Internet has remained free and a platform for innovation for the last fifteen years. With SOPA or Protect IP, that leadership will effectively end, and Syria, China, Iran and others will not only use the US as a role model, they will also use these actions as further evidence of US control of the Internet and justification for trying to turn it over to the UN/ITU. This is best described by Susan Crawford.

Worse, the legislation itself is fundamentally corrupt. It is bought and paid for by big media, trying vainly to protect anachronistic business models. This has been demonstrated clearly in all of the hearings and in the very conduct of the debate. Listening to how deeply uninformed those being asked to legislate this issue are has been nothing short of scary. Watching how support and opposition have lined up has been disheartening. This is the worst example of the kind of fundamental corruption currently at the heart of the US political system, and it is well described by Professor Larry Lessig. If you have ten minutes please watch this video on the subject. If you have an hour please watch this one.

The Internet is not a corpus; it is not a thing. It is a series of protocols, which are really agreements on how computers will behave when connected to the Internet. Treating the Internet like a thing to be legislated and controlled is as ill-conceived as treating "Intellectual Property" like physical property, and it leads to even greater perversions. If governments squeeze too tightly, the Internet as we know it will simply get up and walk away. It will fracture and split into a "clean" Internet and a Darknet much larger than today's, but not one used mainly for file sharing. Instead the Darknet will become the real Internet. Brands will sell things and Media will offer content on the "Cleannet", but the Darknet will be where ideas are shared, plans are made, memes are propagated and where most of the cool people, including most of our children, will be.

Prohibitions have never worked to change behaviours. They simply make people who fear things feel good and create a new mini-industry for fear-mongers to profit from.

If you wish to get involved we suggest you visit Stop American Censorship and BlackoutSOPA.org, and that you follow @tucows on Twitter, where we'll be tweeting regularly about the movement to stop SOPA.


Spectrum as Plentiful as We Let it Be

This is a letter we sent to the Manager, Mobile Technology and Services, Industry Canada today and I wanted to share it with you:

We are submitting these comments on behalf of Tucows Inc. Tucows is a Canadian company that has been around since the dawn of the Internet. We were part of the early days of Internet access when Canada had a position of leadership. We have watched with sadness as Canada has gone from an Internet access leader to an Internet access laggard according to objective observers (Berkman Center releases final broadband study, World Top Continents’ Download Speed). As governments at all levels across Canada agonize over our lack of innovation and productivity gains, it is clear that fantastic Internet access – fast, symmetrical, affordable – is perhaps the greatest platform for innovation that any government can provide.

We appreciate the opportunity to share our thoughts on spectrum allocation in particular. We feel that spectrum allocation policies provide one of the best opportunities to rectify the poor Internet access situation that currently exists in Canada. We also note that spectrum policy is based on artificial notions of scarcity. Much like current intellectual property policy, it rests on notions of scarcity that were important ideas in the industrial age.

The Internet has truly changed things. The Internet allows us to think in terms of abundance, not scarcity.

The notion of spectrum as a scarce resource is based on very old science. This is understandable, but changeable. We also recognize that spectrum allocation has become an important source of revenue for governments and that any serious changes to allocation policy need to encompass this point. In this submission we do not propose to address the issue of government revenue, but we do plan to as the dialogue progresses. Our goal here is to introduce the idea of spectrum as a plentiful resource. Specifically:

  • Spectrum is plentiful, not scarce;
  • Interference is a function of the receivers, not an inherent property of wireless transmission; and
  • With smart radios and well-defined equipment specifications we could take much greater advantage of the spectrum we have.

Following the above would allow Canada to take significant strides in addressing its Internet access issues AND to establish itself as a world leader in telecommunications policy.

As with our submissions to the copyright consultation process in the summer of 2009, we have employed the pen of David Weinberger in an effort to create a submission that is readable and hopefully accessible by an audience wider than most policy submissions are able to reach.

Spectrum as Plentiful as We Let it Be

When I was a lad, our family doctor was a young man named Dr. Murtceps. He took good care of us, and I have stuck with him for over fifty years. The last time I saw him, he shocked me by telling me that he was leaving his practice in order to pursue an important discovery he’d made in physics: over the decades he’s noticed that his loyal patients who have grown old with him are increasingly having trouble with their vision. He leaned in and said with a firm voice that seemed a hedge against the alarm he felt, “I am very much afraid the evidence points in only one direction. There is simply no other explanation.” I waited. “Photons are failing.”

I have to admit I laughed at first. “Doc, you’re joking, right?” I said. “There’s no problem with light. The problem is with our eyes.” He looked at me uncomprehendingly. I struggled for an analogy, but the one I found apparently just made matters worse: “Next thing you’ll be telling me that radio interference is a property of spectrum, and not just a problem with bad receivers.”

Dr. Murtceps wasn’t joking. Neither was I. (And I, unlike Dr. Murtceps, am not entirely made up.)

Unfortunately, our misdiagnosis of the situation with spectrum is analogous to Dr. Murtceps’ taking the weakness of our eyes as evidence of a limitation of photons. The price for being wrong about spectrum, however, is not even slightly laughable. By continuing to treat interference as a physical limitation of the medium itself, we will drastically restrict the usability of spectrum at what could be the highest opportunity cost in history.

Here’s a simple experiment. Plug in an expensive radio next to a cheap one. Play with the radio dial of the cheap one until you find a station with a signal made cruddy by interference. Now tune the expensive radio to the same station. It is likely to have a much clearer signal than the cheap one. Where did the interference go? Nowhere, because interference is not a thing or a property of radio waves. Interference is receiver failure. As far as I can gather – I am not a physicist – radio waves don’t really bounce off one another and knock each other out of shape. They may degrade over distance, and physical objects in their path may diminish their strength, but when one radio wave meets another radio wave, they pass right through each other.
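Here is a toy numerical sketch of that point, my own illustration with arbitrary numbers rather than anything from the radio world: two tones are summed, the way waves superpose in the air, and a simple correlating receiver still recovers each one at full strength.

```python
# Two "stations" share the air; a correlating receiver separates them.
# All numbers here are arbitrary -- this is arithmetic, not a physics model.
import numpy as np

rate = 10_000                        # samples per second
t = np.arange(0, 1, 1 / rate)        # one second of "air time"

station_a = np.sin(2 * np.pi * 440 * t)   # station A broadcasting at 440 Hz
station_b = np.sin(2 * np.pi * 523 * t)   # station B broadcasting at 523 Hz
air = station_a + station_b               # the waves superpose; neither is harmed

def tune(signal, freq):
    """A minimal 'receiver': correlate the air with a reference tone."""
    reference = np.sin(2 * np.pi * freq * t)
    return 2 * np.dot(signal, reference) / len(signal)

print(tune(air, 440))   # ~1.0: station A arrives at full strength
print(tune(air, 523))   # ~1.0: so does station B
print(tune(air, 600))   # ~0.0: nothing is broadcasting here
```

The "interference" a cheap radio hears is simply a receiver that cannot do this separation well.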

The static and fuzz we hear when we talk about interference is caused by the inability of the receiver to distinguish one signal from another. The system we've designed solves that problem by assigning broad continuous swaths of spectrum to designated broadcasters. Fine, but that solves the "interference" problem by limiting the amount of available spectrum: we take the continuum of frequencies and divide it into a relative handful of ranges of frequencies (= "bands").

At one point, that was a reasonable approach. Unfortunately, that point was in the 1930s. Eighty years ago it made sense to regulate who could broadcast at particular frequencies, and to make the assigned swath of frequencies quite broad: if all you have is a blunderbuss, then you do best if you make your target the size of the broad side of a barn.

There are two important good consequences of our current practice of auctioning off slices of spectrum: First, the government raises lots of money from businesses that then make yet more money from the transaction. Win-win is a good thing. Second, we get a system that works.

The problem is that we’ve defined “works” disastrously narrowly. The current system works in that it enables a handful of large corporations to provide quite reliable one-way broadcasts. We are now, however, witnessing a worldwide redefining of “works,” so that a system that does not allow maximum multi-way communication and maximum innovation is broken.

The first half of that criterion, maximum multi-way communication, addresses the cultural, social, and political benefits. Call it free speech, call it open culture, call it open group formation, call it a renaissance; we all nevertheless know it's just waiting to happen.

The second half, maximum innovation, addresses the economic reasons why we should care so deeply about getting the Internet's broadband infrastructure right. It's where growth is going to come from, and it has the potential to be the greatest market-based generator of wealth since we invented open marketplaces.

So, how do we make this new definition of “works” real? Fortunately, what has been keeping us from opening vast new quantities of bandwidth is primarily our old habit of assuming that we have to divvy up the continuum of usable frequencies into thick, scarce bands. Notice that the fundamental verb that gets applied to “spectrum” under the old assumptions is “divide.” Dividing is a verb of scarcity.

We could instead assume abundance. Today’s receivers and transmitters are smarter than they were when you had to turn a dial to tune in a station. They can do what we do when we drive a car on a highway: change lanes to avoid over-crowding and thereby maximize throughput. In this case, the lanes are frequencies. If the frequency the receiver and transmitter are using is getting crowded, they can send a signal and hop over to a different one. This type of intelligent, dynamic spectrum management wrings far more capacity out of the airwaves than doing the equivalent of assigning each car its own fixed lane.
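As a minimal sketch of that lane-changing idea (the channel list, the sensing stub and the threshold below are all invented for illustration; real cognitive radios are far more sophisticated):

```python
# A toy "smart radio" that stays on a quiet frequency and hops when
# its current one gets crowded, like a driver changing lanes.
import random

CHANNELS = list(range(8))     # pretend there are 8 usable frequencies
BUSY_THRESHOLD = 0.5          # how crowded is too crowded

def sense(channel):
    """Stand-in for listening to a channel; returns occupancy from 0 to 1."""
    return random.random()

def pick_channel(current):
    """Keep the current frequency if it is quiet; otherwise hop to the
    quietest channel we can hear."""
    if sense(current) < BUSY_THRESHOLD:
        return current
    readings = {ch: sense(ch) for ch in CHANNELS if ch != current}
    return min(readings, key=readings.get)

channel = 0
for _ in range(5):
    channel = pick_channel(channel)
    print(f"transmitting on channel {channel}")
```

The point is not the code but the verb: the radios negotiate capacity moment to moment, rather than having it divided up for them once and forever.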

Assuming abundance can create abundance because information is not like a car. Over the years we have figured out ways to compress information, to combine multiple signals into a single “lane,” and even to create what David P. Reed (one of the architects of the Internet) calls “cooperation gain”: an improvement in information capacity as more nodes join, say, a multi-hop mesh network.

It’s vital that we drop our assumption that information needs a dedicated, unvarying channel. When broadcasters need to get assigned a “lane” by a centralized authority, it’s expensive and slow to become a broadcaster. Where frequencies have been opened up to all comers in a free market, enormous innovation has occurred already. Imagine if the public airwaves were in fact open to the entire public, with their management handled in real time by the technology itself, rather than by a government office. It would be like the Internet, and we know the result: There are currently 200 million registered domains, and we have just run through 4.3 billion Internet addresses. That happened for two reasons. First, the Internet enables anyone to jump in, without first having to apply for permission, pay a fee, or hope to be assigned a route; the Net gives participating computers addresses, and negotiates the routes between them dynamically. Second, we are a damn innovative species just waiting for the chance.

But, we are being held back by an old way of managing a resource that works by turning it into something scarce. Spectrum is bounteous if we want it to be. It will be a tragedy for which we will not be soon forgiven if we continue to slice this shared abundance to ribbons that we then sell off for short-term gain.

Open up the spectrum and we will figure out what to do with it. Trust us. We’ve just spent the past fifteen years proving that we’re more innovative than even the craziest of us imagined.

Thank You.
Elliot Noss
CEO
Tucows Inc.

Help Save MySQL

One of our core values at Tucows is that the Internet is the greatest agent for positive change the world has ever seen. And we strongly believe that open source tools are central to the continued growth and health of the Internet.

You may have heard that Oracle has acquired Sun, and along with it MySQL, which is a central building block in the suite of open source tools.

While MySQL holds that position in the world of open source software, it is not important to the Oracle acquisition of Sun. In fact, all of the most important reasons for Oracle doing the acquisition would still be in place if Sun had no role whatsoever with MySQL.

The “cost” to Oracle of freeing MySQL is very low. The benefit to the world is extremely high.

With that in mind, please consider lending your support to the campaign. We’ve added our support by signing the petition. The Save MySQL website has lots of information about why the community feels that MySQL is important, and what you can do to ensure it stays open and freely available to the entire Internet community.

Bringing More Efficiency To The Secondary Domain Market

Some experiences in the last couple of weeks have me thinking that registrars need to rethink their approach to the secondary market for domain names and how we deal with each other when high-value domain names are hijacked. In this post I would like to briefly examine this and make a specific suggestion that I believe will improve credibility and therefore efficiency.

There is no question that the secondary market for domain names has become much more efficient. The number of transactions involving high-value domain names has greatly increased, as can be seen simply by looking at the weekly results from Buy Domains and Sedo (for the purposes of this post I am thinking about transactions greater than $500). We can also see greater efficiency with the maturing of the various listing services (DDN, DLS, MLS) and with greater integration by registrars of secondary market domain names in their domain name search results.

The last few years have seen a huge increase in the importance of the secondary market for domain registration relative to the whole domain name economy. While many of the major players are the same, there are also important differences and those differences require some fresh thinking about how to make the secondary market more efficient and more effective.

Of course as this market becomes more lucrative it attracts more “bad guys”. Anecdotally, all the large registrars are seeing increases in the number of hijacking attempts. When aimed at registrars themselves, these seem to be well dealt with, but when these hijackings stem from a hack aimed at third-party email services there is little that registrars can do at a system security level.

We have been involved in two situations recently: one where we were in receipt of a domain name thought to have been obtained illegally, and one where a registrant of ours had a third-party email address compromised. In the first, we worked with the losing registrar and, with the proper protections, returned the domain name to them. In the other, the gaining registrar felt their obligation was to their customer, who claimed to have obtained the allegedly stolen domain name from a third party. They would not help us in the first instance. I expect this latter situation to be worked out, but it did have me thinking.

With the secondary market the players are different. There is essentially no registry involvement and, probably more importantly, there is no formal role for ICANN to play other than as it relates to its contracts. As well there are additional players, specifically owners of high-value names and the various secondary market marketplaces.

These secondary market transactions are of a much higher dollar value than those in the primary market. They warrant a different approach.

Of course there are best practices and additional security measures and services that all owners of valuable domain names should avail themselves of. I expect these services to greatly increase in both scope and sophistication in the coming year. And of course their adoption will not be universal.

I believe that registrars should develop a more standardized approach to how they deal with these situations. We should set out appropriate practices. Of course there will be exceptions, and of course any guidelines cannot be too prescriptive. BUT if we are effective in doing this we will accomplish two things. First, we will make the market safer for those customers who own high-value domain names. Second, we will make things much more difficult for those who attempt to steal the property of those rightful owners AND for those who provide liquidity for the hijackers by buying the stolen property, often with few repercussions.

While in Korea this week for the ICANN meeting I will have the opportunity to meet with representatives of most of the major registrars. We all have an interest in making the market cleaner and more efficient. It is still early days and I have no doubt that this will be warmly received as would any input from other interested parties.

The Sad State of Broadband in Canada

Canadian telcos and cablecos have responded to poor results in recent OECD tables, and a number of other benchmarks, by commissioning a "study" to counter criticisms of the (in my view abysmal) state of the Canadian broadband market. The author concludes that "Canadians have access to some of the most affordable services, while also benefiting from some of the world's fastest connection speeds for both wireline and wireless broadband services".

Sadly, it seems only he agrees. In my role at Tucows I have the pleasure of traveling all over the world and having customers who are service providers all over the world. We are always discussing access markets. I could bore you with story after story, but very few countries have slower, more expensive access offerings than we do in Canada. A fantastic study done for the FCC by the Berkman Center for Internet & Society at Harvard is just the most recent to confirm the sad state of broadband in Canada.

It is not that the author is incorrect; rather, he is misleading, and the document is more a telco/cableco marketing document than a study. I will identify some specific criticisms.

First, and most importantly, is the definition of "broadband", which sets the benchmark from which all measurement and conclusion flow. The "study" uses 1.5 Mbps as its threshold. 1.5 Mbps! I believe this was the launch speed for Bell Canada's DSL service in 1998. 1.5 Mbps as "broadband" borders on nostalgic. This, more than anything else, takes this from "study" to "attempt at persuasion".

It is as if we were talking about hunger and debating how many Canadians are starving. I, like many others, am lamenting how hungry we are. We are complaining that in a country like Canada we should be eating MUCH better. Eating is important for health and innovation and jobs. And the telcos and cablecos have produced a "study" that assures us that we are in great shape. That in fact the whole country has access to a bowl of gruel every day. That we should be celebrating our leadership, not lamenting our laggard status. That we have healthy, competitive markets that are doing just fine, thank you very much.

My second complaint is with the $/Mbps analysis, wherein the author concludes that we are not nearly as bad off as other studies indicate. He uses as his sole basis for the analysis a Videotron service that is $80/mo for 50 Mbps. First, he ignores that this service is very limited in coverage and that a similar service from Rogers is $125/mo. Second, he lauds the fact that this moves us from 28th to 8th in the world tables. Never mind that this covers only OECD countries and that there are dozens of non-OECD countries with far superior offerings. But 28th to 8th? It is like watching CBC coverage of Canadian athletes in the summer Olympics! "Just look at that top ten finish!" Last, and most importantly, it completely ignores upstream bandwidth.

Rogers recently launched a 50 Mbps service in limited areas of Toronto. It is only "up to 2 Mbps" upstream! Quick story. My son (11) spent last weekend hard at work on a video for a charity project his class was engaged in. After many hours and missing much of the weekend's fun, he finished his slightly-over-3-minute video, which naturally included some HD video clips. To upload that video to Vimeo took three tries and 45 minutes (and this was after a couple of failed attempts to upload to YouTube). Total time spent on the upload was well over two hours. AND, worst of all, after finishing we were obviously placed into some kind of copyright-infringing bandwidth-hogging penalty box at Rogers, and the Internet basically crapped out, taking some waiting and a number of router reboots to return to normal.
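The arithmetic of that bottleneck is easy to sketch. The file size below is my own guess; a few minutes of HD video can easily run to a few hundred megabytes:

```python
# Back-of-the-envelope upload time on the advertised upstream.
file_mb = 500                           # assumed size of the finished video
upstream_mbps = 2                       # "up to 2 Mbps" upstream, at best
seconds = file_mb * 8 / upstream_mbps   # megabytes -> megabits, then divide
print(f"{seconds / 60:.0f} minutes per attempt")   # ~33 minutes
```

Half an hour per attempt at the advertised best case, before retries and throttling.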

What parent wouldn't want their son spending hours on the weekend filming, editing, doing voiceovers and poking at software to make a video FOR SCHOOL. FOR CHARITY! Sadly, the current Canadian broadband market not only discourages, but punishes this behaviour.

I want, and there is no reason we cannot have, at least 100 Mbps of fully symmetrical bandwidth. It is a global competitive imperative. Telcos, cablecos, I do not want your lousy bowl of 1.5 Mbps gruel. Please sir, may I have some more?

(Thanks to Flickr user kainr for the photo and for releasing it under a Creative Commons license)

On “The Affirmation of Commitments”

There was big news in the ICANN world today with the announcement of the “Affirmation of Commitments”. This is the document which will now govern the relationship between ICANN and the US government (“USG”) as well as the rest of the world (“ROW”).

This is an important step in ICANN's evolution in two respects. It marks a significant move away from formal USG control of ICANN, and it further solidifies ICANN's role in governing the Internet, with that governance being global in nature, NOT controlled by the national governments of the world.

Remember that ICANN was created in 1998 and has had three different types of documents governing its relationship with the USG. We have gone from a "Memorandum of Understanding" to a "Joint Project Agreement" to now an "Affirmation of Commitments" (AoC). To quote Bret Fausett: in this ever-lightening chain of commitments, what is the next step? Facebook Friends?

Joking aside, this removes a serious problem for ICANN. Since its inception the ROW has been troubled by the exclusive oversight that the USG had over ICANN. The Internet is global, so the oversight should be too. This has led, from time to time, to calls for the UN, the ITU or some other quango to take over from the USG. The AoC addresses this and gives the ROW a large say in appointing the group that provides oversight of ICANN. This is a HUGE step forward.

Notice I did not say that the ROW has a say in oversight, just in appointing the group that provides oversight. This is equally important. The terms of this oversight are laid out in the AoC, and what happens if ICANN does not abide by these terms is also spelled out: the AoC fails and we are back where we were, to try again. This is a fantastic way to allow ICANN to flourish independently and to keep ICANN a global, not international, organization. Think of this as a trust and those appointed as trustees. They will determine whether the terms of the trust have been abided by. If they have not been complied with, then ICANN reverts to its previous state of USG control and we start again.

In its day-to-day operations this will not make a lot of difference. There were VERY few circumstances where the USG had a heavy hand. The disallowance of .xxx and the occasional burst of input when big IP interests would complain about domain names and copyright are the few exceptions. The USG deserves thanks for its role to date.

It is also very important that we (and by “we” I mean “we the Internet”) have avoided the UN or the ITU. Either would have been disastrous for ICANN in my view.

All in all a good day and another positive development in the young regime of Rod Beckstrom as ICANN CEO. Now let’s see if he can thread a needle on new gTLDs!

The True Value of URL Shorteners

I have been watching the discussion on URL shortening that followed the funding of bit.ly with great interest and some surprise. Josh Schachter started it off. Dave Winer, Cory Doctorow and Howard Lindzon, among others, followed. The points raised are indeed interesting, but what is so surprising to me is that the answer to all of the concerns is not only so simple, but right in front of their noses.

First, some background. URL shortening has been around for years. It long preceded TinyURL and has always been good business. We got into the business in 1997 with Domain Direct, a service that dealt with what we called at that time the “~ problem.” This was the long and embarrassing URL that came along with the free webspace most ISPs provided at the time. It also dealt with the long and embarrassing URLs that came with free websites from the likes of Tripod and Angelfire.

TinyURL and the like came along years later with the purpose of making the sharing of temporary URLs (blog posts and news items mostly) much easier, but they are not as effective as a domain name for permanent URLs, like http://noss.org/work. The experience with Domain Direct and a love of URL shortening was what drove our thinking in coming up with Hover.

URL forwarding services have three goals. They should be easy to use, should make long, complicated URLs short, and the resulting URLs should be memorable. Easy is a function of the tools (and I do think our tools at Hover are the easiest available). Short and memorable are a function of the semantics.

When looking at “short” we should be clear that it is only in Twitter, and then again only in the rare Twitter post, that “hyper-short” matters.

It is with “memorable” that the difference really emerges. So let me be clear. The best “URL-shortening service” is simply a combination of great tools and your own domain name. The difference in using http://noss.org/bitly and http://is.gd/pind is huge in terms of “memorable.” Not only is the shortened URL easier to remember; it becomes a bit of personal branding (I hate using the word “branding” in this context but I do not have a better alternative. The whole concept of earned media is definitely relevant here), especially when the shortened URL is shared forward by a third party!

Of course, using your own domain to create forwards also addresses all of the concerns of control, archiving, spamminess and other evils that were raised in the original posts and elsewhere. I have now had this conversation with three hardcore geeks, and when I say "Look, the answer is simple. Just use your own domain and CNAMEs!" they just stare at me and say "Oh yeah. I never thought of that!"
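For the skeptics, here is a minimal sketch of what that looks like in practice. This is not how Hover is built; the hostname, port and link table are invented for illustration. In DNS, you point a CNAME such as go.example.com at whatever host runs the redirector, then hand out links like go.example.com/work:

```python
# A tiny self-hosted URL forwarder: memorable paths on your own domain
# redirecting to long URLs. All mappings and addresses are hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

LINKS = {
    "/work": "https://www.example.com/some/very/long/forgettable/url",
    "/talk": "https://www.example.com/another/long/url?with=tracking&cruft=1",
}

class Redirector(BaseHTTPRequestHandler):
    def do_GET(self):
        target = LINKS.get(self.path)
        if target:
            self.send_response(301)             # permanent: you own the name
            self.send_header("Location", target)
            self.end_headers()
        else:
            self.send_error(404)

if __name__ == "__main__":
    HTTPServer(("", 8080), Redirector).serve_forever()
```

Because you own the domain, the shortened links stay under your control: they can be archived, audited and kept alive for as long as you care to keep them.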

Some Thoughts on ICANN’s Next CEO

Now that the search has officially commenced, I thought it might be useful to make some public statements as to what I would like to see from the next ICANN CEO. My comments are driven by what I see as the deficiencies over the last number of years and, most importantly, by a deep desire to see the ICANN experiment in global governance succeed. The Internet is the greatest agent for positive change the world has ever seen and a healthy ICANN strengthens its ability to foster positive change.

For me there are three essential qualities required and they are tough to order because I would like to see them all. They are as follows:

  • a deep love and understanding of the Internet;
  • the ability to “run a business” responsibly; and
  • the ability to lead with vision.

First, a deep love and understanding of the Internet.

“Choose a job you love, and you will never have to work a day in your life.” ~ Confucius

For me, Mike Roberts was the best ICANN CEO to date, and the reason is that he was the one who most loved and understood the Internet. ICANN is responsible for names and numbers, which are about finding and using resources on the Internet. Appreciating what that means and why it is important is central to being the ICANN CEO. Too much of the last few years has been about ICANN as an institution for the institution's sake, not about having ICANN live in service to the Internet. A great CEO will create and lead an ICANN that lives in service to an open Internet and to the role of names and numbers inside of that.

Second, the ability to run a business responsibly. ICANN as an institution has ballooned over the last few years, seeing its budget grow by massive amounts. I am in favor of a healthy ICANN that is not begging its constituents for money and that is able to provide necessary staff support for policy creation and management. However, the money should be spent as if it were their own! There is much too much wasted on very expensive consultants, staff duplication and unnecessary efforts. There is a good core of credible and productive staff who I believe would respond very positively to this.

The next CEO should be comfortable learning about an issue and making a decision. Rather than paying BCG, McKinsey or some other exorbitantly priced consultant to call me, and a dozen others, to ask for our opinion on an issue, the CEO should research the topic and come to a decision. Yes, ICANN is a consensus-driven, bottom-up organization, but that need not apply to every issue. To be clear, I am talking here about day-to-day issues like a new RAA, transfers or whois.

The next CEO should be comfortable making decisions, leading the team and spending money responsibly. They should be a doer. The do/say ratio in ICANN needs to increase immeasurably.

Lastly, the next ICANN CEO needs to be able to lead with vision. So much of what ICANN deals with concerns the future, not the past or the present. The next ICANN CEO needs to possess enough imagination to create a broad vision for the organization and lead staff and the various constituents in that direction. This does not mean they should drive the policy-making process, nor that they should substitute their judgment for the community, but that they should have a big picture view of what the organization looks like when it is functioning well and how the organization exists in service to the Internet.

The organization should not lurch from issue to issue like it does now, constantly fending off imagined existential threats. It should move in a clear direction toward a bright future.

Yes, I know I am looking for a lot in one person, but I really believe that at this point in the Internet’s history, ICANN demands more than a CEO. It needs a passionate visionary.

Restructuring at Tucows

Today we made the decision to restructure our business, which reduced our number of employees by roughly 15%. I have just finished an all-hands meeting where I talked about today’s events with our people.

Our thoughts today are with the people who left us. They were our friends and colleagues; each made meaningful contributions to our business and was well liked. We offer them our sincere thanks for their hard work and our good wishes as they go forward.

We decided to take this step because of uncertain overall economic conditions and because our performance has been affected by a number of unanticipated challenges during the first nine months of the year, including advertising revenue dampened by the weakness in the economy and by reduced payouts to the domain channel by Google and Yahoo, which is in turn impacting domain portfolio advertising revenue and especially bulk domain portfolio sales.

I have also never seen a macroeconomic environment like the one we are seeing now. I am old enough to have lived through a number of down cycles, but there are elements of this one that make it unique and that will take time to work through.

I am immensely proud of the great work our team has done together this year: the product launches of Butterscotch, Hover and Storefront; the brand launches of OpenSRS and YummyNames; and the smooth email migration to our new platform.

We are luckier than most in that what we sell, domain names and email, is more like milk and bread than like cars and refrigerators. We are also luckier than most in that we generate cash and will continue to.

As we look forward to 2009, I believe we have a strong team who will continue to innovate, to work efficiently and maintain our positive momentum. I fundamentally believe that our strength comes from our people and I look forward to working hard together over the coming weeks and months to exceed even our own expectations.

And again, today, our thoughts are with the people who have left.

My hand hurts, I’ll cut off my arm

Yesterday a large webhosting company, Dreamhost, told the world that, while they would continue to provide email, their email service was not that great and suggested their customers should probably use Google’s Gmail instead.

They provided some fascinating data about email and support costs. My two favorite nuggets:

"Just over HALF of all the support requests we get are about email. Everything else we offer, combined, doesn't add up to the amount of trouble, expense, use, and effort that goes into 'simple' old email."

and:

"If a web server with maybe 750 customer sites on it were to go down for even as long as five hours, we'd probably get two angry messages about it. But if email goes down for the same number of customers for just five minutes we'll have already received 50!"

And they are clear as to their view of quality:

“(email is) something the big free email providers like Yahoo, Microsoft, and Google can do better.”

This post was picked up on Slashdot, where the discussion, not surprisingly, swung back and forth between "I am a sysadmin managing 20 domains and use Google Apps and Gmail and love it" and "You should always run your own mail server for privacy purposes and, well, it's just plain fun."

Both the original Dreamhost blog post and the resulting Slashdot discussion completely missed the point. Luckily the comments on the Dreamhost blog did not. They were very clear.

Overwhelmingly, commenters said that they had come to Dreamhost for hosted email and that they did not trust or want to use Gmail for their business email; many of them would immediately leave if Dreamhost discontinued offering email.

Every service provider should be required to read the Dreamhost blog post and, more importantly, the comments.

Whether geeks like it or not, the vast majority of people want and need simple, reliable email that is easy to use, AND they want a supplier who will help them use it. That means providing phone support as well as resources to make things simpler. Support data provides golden information about i) how a service can be improved and ii) what your customers' needs and wants are. Guess what? People are willing to pay for this.

Contrast the Dreamhost view with that of Rackspace. Faced with, I suspect, the same or similar data, Rackspace responded by going out and buying Webmail.us.

It is amazing to me that, because most service providers have chosen to give email away, they take that as proof that people do not or will not pay for a quality email experience. People will pay over $80/month for a single cup of coffee per day. People paid Geek Squad over $1 billion last year to "set up" their wireless routers. Every geek knows how hard (or not) that is! My ten-year-old son does just that for my mother-in-law. With regard to email specifically, RIM, the Blackberry people, have a market cap of over $75b JUST FROM PROVIDING A PORTION of people's email needs!

People, especially small businesses, use email more than anything else on the Internet—much more than they use or need web hosting. Service providers are in the business of making the Internet easier and more effective—whether they like it or not.

Geeks who run service providers may find Gmail great. Human beings, not so much.

Thoughts on the Domain Name Price Increases

I wanted to share some thoughts with all of you on a dark day in Internet history. On October 15th the price of a .com will increase by $0.42, marking the first price increase in the history of the modern Internet. Worse, this now signals a near-annual event that will take place in all major gTLDs. It is simply wrong. My full comments in the public forum in Puerto Rico in June are here.

While I do think Verisign has shown a lack of stewardship of this key public resource, I lay the primary blame for this on the ICANN staff who put this forward and on the ICANN board members who voted for it (it should be noted that the vote was 9-5, one of the closest in ICANN annals). As I said in Puerto Rico, shame on you. We all, all of us involved in the ICANN process in any way, owe the Internet public an apology because of this.

It is important that we do not use this as a sign that ICANN, the idea, is failing. We should not confuse bad execution with bad strategy. The role of ICANN as an example of truly global, not international, governance is important. The role of ICANN in keeping the Internet free from government control, and thereby from the predation of special interests, is vital.

And it is a challenging environment. There is a debate inside the Registrar constituency right now, effectively re-fighting a battle that was already won, but sloppily implemented by staff. Many of you (the “you” here is our customers) will have already dealt with the end-user problems created by Go Daddy and Network Solutions in their “interpretation” of transfers policy in the name of “security”. For me this is simply deja vu.

Service providers, there is something you can do. Something important. There has been a process of GNSO reform going on inside of ICANN for the last 18 months. The GNSO is the primary policy-making body in the ICANN process. They are the ones charged with making policy for gTLDs. The board only has the power to ratify policy. Staff only has power to enforce and interpret policy.

Inside of the GNSO there has been something of a stalemate for the last few years. One of the chief reasons is that the Internet Service Providers Constituency (“ISPC”) has consistently sided with the Intellectual Property Constituency (“IPC”) on things like whois access and new gTLDs. I have been in and around the ISP industry now for 13 years and the ISPC does not look like any ISP assembly that I know.

I have been advocating change in the GNSO reform discussions. In Lisbon in March and again in Puerto Rico in June I have advocated a recasting of the ISPC. My position is that it should be a constituency for companies who stand between the “contracting parties” (ICANN-speak for Registrars) and end users. Most of the industry calls these people resellers (an old OpenSRS anachronism). They have no place or voice in the ICANN process right now and they need one. We have been trying to advocate their interests (your interests) for years. You can do a better job of it than we can.

When it comes to transfers, to whois and to most issues of DNS policy they (YOU!) are a voice that needs to be heard.

My advocating is the easy part of the battle. The harder part will be to actually have some of you folks do it. So take this as a plea to storm the ramparts! Now! In the next couple of days we will post more here about the ISPC, what can be done, and how to do it. The time commitment is VERY small and the impact can be very large. Just ask George Kirikos what a little effort can accomplish in the ICANN process!

Why We Bought Kiko.com

On August 26th, 2006, Tucows was the winning bidder in the widely discussed (Techmeme, digg.com, Stowe Boyd) eBay auction of the web-based calendar application Kiko.

Why Did We Buy Kiko?

While there are a lot of little reasons, a few of which I'll cover in a moment, there is really one big reason why we bought Kiko. We needed the functionality, quite desperately, inside of our email platform, and it was going to take us a long time to build it ourselves, especially at Kiko's level of sophistication.

The Calendar Function

Most webmail platforms have a calendar, but very few of them are ever used. It is quite simply a crappy user experience. We have a shared-calendar problem ourselves inside of Tucows, and because we are a mixed-desktop environment we are not able to go with the expensive-frustrating-functional Exchange Server solution. At times there have been real pushes for this internally, but I have pushed back and insisted that anything we do with a shared calendar be based on open standards. There is not much out there.

We all believe that a calendar is a very important function in the messaging suite for small businesses. Given that people don't want to maintain separate services for personal and business use, and because the line between personal and business services is getting blurrier, we felt this functionality was a big hole for us.

So why didn't we build it? Well, the short answer is that we have so many things to do in general, and so many exciting things to do with email in particular, that it was just not going to be possible until at least Q2 of next year, and even then the plan didn't really excite anyone around here. It looked sort of like the next-gen of our current offering. Had this not come up we would probably have stayed the course and looked to catch a break. When it did, we quickly went through a simple calculus.

The Important Question

What would we pay to have a kick-ass AJAX-based calendar available now?

When I am dealing with quick, complicated decisions I really like to boil them down to a simple abstract construct. Yes there are a huge number of shadings around that question but at its simplest that is the essence of the decision. What was the value to Tucows of the time and the certainty? Of being in the market with this functionality six to twelve months earlier than otherwise? What was the value of having it be good for sure? Even if we threw it away in six months (not that we plan to do that)?

What I can tell you for certain (and you'll be able to hear more details in an upcoming podcast) is that it was more than we paid!

This Situation and Tucows

From the time the auction was announced, there was great discussion online about the value of Kiko to a buyer and much of it was both accurate and confirming. Justin and Emmett (see them being interviewed by Alan Wilensky here, here, here and here) were absolutely right in determining that Kiko was a feature not a business. We think they were absolutely right in assessing that integration with email was key and that the greatest value here was to someone with a suite of services to integrate with. We felt that this was going to be 2-3 man-years of work and they confirmed that. All of this made us more comfortable in the short period of time that we had to make our decisions.

There were also some interesting facts that were specific to Kiko that made it work for us. It was clear from their posts and such that Justin and Emmett were no longer passionate about the calendar space and were excited to do something else. They felt, and we agree, that this was worth much more with them along for the ride. Probably by a factor of ten. It would have then attracted a completely different type of buyer. We would not have paid that premium for the people. Not that they aren't worth it. Just that our financial calculus was different. This probably kept some of the natural buyers out of the process.

We also did not need a huge base of retail users. They are nice and we will provide them with a great home but if this had been much of a success outside of Mike's 53,651 it probably would have attracted more financial buyers or domainers and the price might have ended up more than we were willing to pay. It is worth noting here (and we also talk more about this in the podcast) that there was clearly interest in the domain name and the traffic. We will certainly monetize that as it is a space we know well, but we also may choose to sell the name off as it is not core for us. Either way it is another place where we, more than most/all other buyers who would be interested in the calendar functionality, will be uniquely able to take advantage of the assets.

In a nutshell, this was the kind of deal where we were buying exactly what they were selling. That makes for good business and, by the way, is too infrequently the case with Internet services.

Other Benefits

As we dug deeper there were a number of other little benefits that made this seem like a great fit and got us comfortable pushing ourselves a bit on the spend.

I will call out a few of these, but this list is not exhaustive:

Global User-base – For some, the non-US customer set and things like language support may not have been seen as benefits. For us they were a very nice addition. Our business is extremely global, with customers in over 110 countries. We have a large European business and a large South American business. We have plenty of customers in Asia. The customers and languages that come along with Kiko are a nice benefit for us.

Mobile Integration – Kiko has a very impressive set of mobile carriers they integrate with. We were blown away when we dug into this. It will be nice to have that functionality for the calendar. It will be even nicer to have an existence proof for making the rest of our services more mobile. We are just starting to experiment with mobile around the edges of our business and this will help things along.

Nice AJAX Implementation – Kiko is a very nice use of AJAX, especially in a lot of the underlying thinking. To me, that is not about technology, but about making a web app behave more like a desktop app. Learning how this worked within Kiko and having to maintain this code base will be very good for the rest of our services. Again, there is a nice broad application of a benefit to be taken advantage of.

Conclusion

First and foremost this was about better/faster. We were able to get a key feature done well, and done now. In my view we were lucky with a number of the small things that made this happen. The people were not part of the deal which held down value for one group of buyers. The retail user base was real but not too large, which held down value for another group of potential buyers.

There were also a number of side benefits which are important in any good deal. The global user base and language support, the mobile integration and the nice use of AJAX are three examples.

All in, we are quite excited about this. We thank Justin and Emmett for all their hard work, and we look forward to giving the existing customers an ever-improving user experience and to bringing a great shared calendar to the millions of end-users and thousands of partners who use Tucows services today.