Net Neutrality Friday

Well, this is my year in review in Telecom Regulatory affairs and Net Neutrality. The biggest news of the year was the imposition of Title II on Broadband Internet Services. If you have read my stuff, you will have seen that I considered this whole episode unfortunate. Early on in the Netflix/Comcast debate that catalyzed this issue for the general public, Cogent admitted that it was the cause of the Netflix slowdown. This was ignored, and the fact that no ISP had slowed down Netflix was overlooked. Because of this we all decided we had a problem that needed to be fixed. And fix it we did. My issue is that I consider Universal Broadband Service a much higher priority, and that will take years to address. Even with all the carriers taking Connect America Fund (CAF-II) money, don't expect 100% coverage for a very long time.

The other big issue this year has been consolidation. You are seeing this at the ISP level (Time Warner/Comcast), the equipment level (Nokia/Alcatel-Lucent), and the chip level (Broadcom/Avago). I think this means that the industry is slowly deciding that it needs to act as a commodity business at the network level. I agree with this, but it also means that there will be a lot less innovation in networking. The Commercial Airline business is an industry that has behaved in a similar way. So, I expect Mergers and Acquisitions to dominate the corporate landscape for a long time. It also means that, with very few exceptions, you will not see equipment startups that win.

On the other hand, services are through the roof right now. The players that consumers think about (Google, Facebook, and Twitter) are all services companies, not ISPs. The same is true on the business side, with a shift progressing toward Cloud Services and away from Enterprise applications. I would expect this trend to continue for some time. It takes nowhere near the kind of investment to make a services startup that it does to make an equipment startup.

The one thing that is not happening is the build-out of new networks. With all due deference to Google Fiber, they simply have not added lots of new cities. If you look at the Google Fiber map, it overlaps curiously with AT&T properties. There are a few CenturyLink cities, but Verizon territory is untouched. This leads me to say that Google is trying to spur on Fiber to the Home (FTTH) deployment. With Verizon having broadly rolled out FiOS, there is no need to overbuild there. I predict that Google will sell its network within 5 years.

The Wireless front has been tumultuous with attempted buyouts and other actions. However, at the end of the day we are, for the most part, where we were at the beginning of the year. T-Mobile's Uncarrier offerings have changed the way the consumer buys service, but the other Wireless Providers have largely matched them and not much has actually changed.

Finally, the Over The Top (OTT) video market is starting to get some real broad traction. I would expect to see more cord cutting over the next few years. If I could get a rational sports plan, I would cut today. But I want to watch the Warriors, and NBA League Pass will not let me do so.

Have great Holidays and thanks for reading! Jim Sackman Focal Point Business Coaching Business Coaching, Executive Training, Sales Training, Marketing

Change Your Business - Change Your Life!


Sonoma County: News and Notes

It's raining here in Sonoma County and that is a wonderful thing. We need the water, and this is our time of year to get it. We get so much sun that we need our rain as well. This week I want to talk about Autodesk's earnings from last month (just before the Holiday). The company had almost $600M in Revenue and lost almost $44M in Q3. The problem, as I have said before, is that Autodesk is in a conversion from purchased software to leased software (i.e. a subscription-based service). We are in the middle of this conversion, so comparisons to the past are not reasonable. Nor can we understand how the company is doing until the conversion is complete.

Why is this so hard? Well, when you buy something outright, the company generally gets to declare the revenue right away. Yes, there are some exceptions under Sarbanes-Oxley, but I will try to keep it simple. When you sell a subscription, you earn that money over time, even if the user buys a year at a time. In that kind of model, you would generally take 1/12th of the annual payment each month. To complicate things further, packages are owned outright when bought by customers. They generally do not buy again until there is a new version with features that they want. They do often get maintenance updates for bugs in the current version (and this is often a paid-for service). By converting to a subscription model, customers will pay more over time, but the company will need to deliver updates and bug fixes on a regular basis.
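
To make the timing difference concrete, here is a minimal sketch comparing the two models. The license price and subscription rate below are made-up numbers for illustration, not Autodesk's actual pricing.

# Minimal sketch: perpetual license vs. subscription revenue recognition.
# The $3,600 license price and $1,800/year subscription are illustrative only.

LICENSE_PRICE = 3600          # perpetual license, recognized up front
ANNUAL_SUBSCRIPTION = 1800    # paid for a year, recognized 1/12th per month

perpetual = [LICENSE_PRICE] + [0] * 11           # all revenue lands in month 1
subscription = [ANNUAL_SUBSCRIPTION / 12] * 12   # spread evenly over 12 months

for month in range(12):
    print(f"Month {month + 1:>2}: perpetual ${perpetual[month]:>6.0f}  "
          f"subscription ${subscription[month]:>6.0f}")

print("Year totals:", sum(perpetual), "vs", sum(subscription))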

So, why does this create a problem? You cannot directly translate the number of licenses of each product that Autodesk sells into subscriptions that it will sell. From the way I read the analysts' questions, Autodesk has not provided good guidance around how many subscribers it will take to replace the revenue from its existing business. It might be possible to calculate this from the store, but the questions indicate that nobody has done so to date. On top of that, there are multiple ways that customers can buy subscriptions. This makes the reporting of Net Adds (the number of customers who added subscriptions minus those who left) rather meaningless. If Autodesk wanted to make this easier for analysts, it would have to provide information like ARPU and churn data. ARPU is Average Revenue Per User, a metric of how much the average customer spends on a monthly basis. Churn is the amount of customer loss on a monthly or quarterly basis. This is how most subscription companies report the kind of information that analysts and investors want.
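
For readers who have not worked with these metrics, here is how ARPU and churn are typically computed from the kind of figures a subscription business reports. Every number below is invented for illustration; none are Autodesk's.

# Hypothetical quarterly subscriber figures; none of these are Autodesk's numbers.
subscribers_start = 100_000
subscribers_added = 8_000
subscribers_lost = 5_000
subscription_revenue = 45_000_000   # dollars for the quarter

subscribers_end = subscribers_start + subscribers_added - subscribers_lost
net_adds = subscribers_added - subscribers_lost
average_subscribers = (subscribers_start + subscribers_end) / 2

arpu_monthly = subscription_revenue / average_subscribers / 3   # 3 months per quarter
churn_quarterly = subscribers_lost / subscribers_start

print(f"Net adds: {net_adds}")
print(f"ARPU: ${arpu_monthly:.2f} per month")
print(f"Churn: {churn_quarterly:.1%} per quarter")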

Until we are through this transition, any investment in Autodesk carries risk around this change in execution. However, Autodesk is a fine company that has been around a long time. They are likely to figure this out, even if they stumble through the transition. For the average investor, I would say be cautious until you hear something that you can directly relate to.

Have a great day and stay dry!

Jim Sackman Focal Point Business Coaching Business Coaching, Executive Training, Sales Training, Marketing

Change Your Business - Change Your Life!


Net Neutrality Friday

I saw a number of things this week that point to extending the life of twisted-pair copper for telcos. If you are a consumer, should you care about the underlying technology involved? I think there is value in understanding why companies make these decisions from a cost standpoint and then what the differences are in the service that you would receive.

One of the big attractions of copper is that telcos want to use the infrastructure that is already in the ground. The problem is that as you raise bit rates, you have to have shorter runs of copper. The reason is something called the "Signal to Noise Ratio" or SNR, which represents how much random environmental noise is in the transmission path. To think about SNR in a really simple way, think about the quality of FM radio versus AM radio. FM is "better" because it has a higher SNR; the signal is clearer. The common transmission media, from worst to best from an SNR standpoint, are air, copper, coax, and fiber. No matter what you do, it will always be easier to transmit high rates over long distances on fiber.

There are a couple of other problems with copper. One is that the copper network is built from 50-wire bundles made up of 25 twisted pairs. The pairs are insulated wires that are twisted together to make the transmission quality better than two wires laid side by side. The wires don't just carry energy; they generate electromagnetic fields that are picked up by nearby wires. Those fields generate noise, which lowers the SNR. On top of that, any anomalies in the copper, like broken insulation, can cause additional noise. That means that as copper ages it has more and more problems. As a consumer, this means your service might get worse over time, even if the telco is not trying to make that happen.
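
To put a number on why SNR matters, the Shannon limit C = B * log2(1 + SNR) caps the achievable bit rate for a given bandwidth and noise level. Here is a small sketch; the 17 MHz of spectrum and the two SNR values are illustrative, not tied to any particular DSL standard.

import math

def shannon_capacity_mbps(bandwidth_hz, snr_db):
    """Shannon limit C = B * log2(1 + SNR), returned in Mb/s."""
    snr_linear = 10 ** (snr_db / 10)
    return bandwidth_hz * math.log2(1 + snr_linear) / 1e6

# Illustrative numbers: the same 17 MHz of spectrum at the SNR you might see
# on a short, clean copper loop versus a long, noisy one.
for label, snr_db in [("short clean loop", 40), ("long noisy loop", 10)]:
    print(f"{label}: ~{shannon_capacity_mbps(17e6, snr_db):.0f} Mb/s ceiling")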

This trend to keep copper has been around since FTTH was first deployed in volume in a few places. The reason is pretty simple: the less a telco spends to deliver service, the sooner it gets a positive Return On Investment. The catch is that upgrades to the next level of performance then require new network construction. With FTTH, the fiber that is laid is useful for much higher rates than people deploy today. It is (in the worst case) a matter of changing the equipment on both ends or (in the best case) a provisioning change to allow the user access to more bandwidth. At some point, the money spent to extend the life of copper will greatly exceed the cost of laying fiber. Part of the reason is that getting very high rates means that most of the distance to the telco Central Office has to be fiber anyway. If a telco has to replace copper because it has gone bad, it will be time to move to fiber. So, slowly but surely, copper will leave our local outside plant. In-building, it will probably be around for a very long time. As copper goes away, it will become easier to build, operate, and upgrade high-speed access networks.
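
As a back-of-the-envelope sketch of that crossover, assume copper needs a fresh round of electronics every upgrade cycle while fiber is a one-time construction cost with cheap upgrades afterward. Every figure below is invented purely to show the shape of the comparison, not to represent any carrier's real costs.

# Hypothetical per-home costs, purely illustrative.
copper_upgrade_cost = 500      # new electronics each upgrade cycle
fiber_build_cost = 1200        # one-time construction per home
fiber_upgrade_cost = 50        # swap optics / provisioning change per cycle
cycle_years = 4                # how often the next speed bump is needed

def cumulative_cost(years, build, per_cycle):
    upgrades = years // cycle_years
    return build + upgrades * per_cycle

for years in (4, 8, 12, 16, 20):
    copper = cumulative_cost(years, 0, copper_upgrade_cost)
    fiber = cumulative_cost(years, fiber_build_cost, fiber_upgrade_cost)
    print(f"Year {years:>2}: copper ${copper:>5}  fiber ${fiber:>5}")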

And that is what consumers want! Jim Sackman Focal Point Business Coaching Business Coaching, Executive Training, Sales Training, Marketing

Change Your Business - Change Your Life!

Net Neutrality Friday

This week had a couple pieces of news of interest. First, AT&T closed its deal with DirecTV. This is of interest from a video standpoint. AT&T bought a significant number of video customers outside its core territory. On top of that, DirecTV can be bundled with DSL in rural areas of AT&T's footprint to provide triple play services. I think it is also a hedge against a network upgrade to Fiber To The Home (FTTH). In most homes, video and video streaming take up a huge portion of the bandwidth that a consumer uses. If you can get video moved over to satellite, then U-verse moves from being 25 Mb/s shared between video and data to a 25 Mb/s data service. This could be viewed as a network upgrade for consumers, as they no longer have to share their bandwidth. This whole activity gives AT&T a huge leg up in the video distribution business. The challenge is that video distribution is not a high gross margin business. What we have yet to see from AT&T is any move into the content side of the business. Verizon bought AOL for its advertising network, which is at least a way to monetize content. I suspect AT&T is not done buying assets because of that.

Second, there are upcoming changes to the Retransmission Rules. These rules govern the rights and rules for carrying local broadcast stations on cable systems and other Multi-Channel Video Programming Distributors (MVPDs). As you may recall, these are the rules that got Aereo put out of business earlier this year. Many governments require that local MVPDs carry local television broadcasts, and there is a need to negotiate a price that the MVPD pays for this privilege. Essentially the local TV station gets paid for the delivery of content that it provides for free Over The Air (OTA). If you go back 30 years, there were TV antennas everywhere on rooftops. Now they are extremely rare. That is because most people get their local TV broadcasts from their MVPD. This has been an awesome deal for local TV stations, as they get a bunch of money for not doing anything new. Expect a lot of back and forth through the legal and regulatory process. The FCC is starting a Notice of Proposed Rulemaking (NPRM) on this topic. If you remember, it was an NPRM that caused the whole "Fast Lane" controversy. Stay tuned for lawyers to make a lot of money!

Finally, there was an announcement around Software Defined Networks (SDN) by AT&T. AT&T announced that its "Network On-Demand" Carrier Ethernet service has cut provisioning times by 95%. This is an interesting case study on how automation around SDN might save a lot of time for specific services. In the old days, Flow Through Provisioning (see OSMINE) was the traditional way of issuing work orders. The more standardized the service implementations, the better this will work. One thing I want to note here is that this was implemented on a single, specific service. This makes sense, as it would be too complicated to implement a multi-service SDN at this time. It makes sense to pick a simple single service with few possible changes (in this case bandwidth) and learn how the automation works. I am not sure how much money this saves AT&T. Probably not much. But it is a live technology trial at a Tier 1 carrier that is making money on a service that customers want.
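
I have no visibility into AT&T's actual systems, but a minimal sketch of what a single-service, single-parameter provisioning change looks like once it sits behind an automated API might be something like the following. The controller URL, endpoint, payload shape, and service ID are all hypothetical; a real controller's northbound interface would differ.

# Hypothetical sketch of automated bandwidth provisioning for one
# Carrier Ethernet service. The URL and payload are invented for illustration.
import requests

CONTROLLER = "https://sdn-controller.example.net/api/v1"

def change_bandwidth(service_id: str, new_mbps: int) -> bool:
    """Request a bandwidth change on an existing Ethernet service."""
    resp = requests.patch(
        f"{CONTROLLER}/ethernet-services/{service_id}",
        json={"committed_rate_mbps": new_mbps},
        timeout=30,
    )
    return resp.status_code == 200

# What used to be a multi-week work order becomes a single call:
# change_bandwidth("svc-00123", 500)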

Have a great weekend. Jim Sackman Focal Point Business Coaching Business Coaching, Executive Training, Sales Training, Marketing

Change Your Business - Change Your Life!

Net Neutrality Friday

This week there were several chunky pieces of news. Most of it relates to the conversion to the All IP network. That title itself is a bit of a misnomer, because one of the topics covered was copper facilities. There has been significant work on IP Voice Network Interconnection, and that looks to be moving forward in a straightforward manner. There is not really a policy issue here, as it is more of a technical issue than a business issue. Carriers will want to connect their VoIP networks, and there is likely not a big advantage to any specific vendor. Because of that, the topic does not have a lot of contention.

A much more contentious topic is the notice required for retiring copper networks. Consumers will get 3 months' notice, Competitive Carriers will get 6 months, and equivalent services must be offered on the fiber networks. I think this actually presents a bigger challenge, as the wholesaling of bandwidth has really only been defined under the old Project Pronto. If a competitive carrier leases facilities (aka UNE-L), there is often no direct equivalent in fiber networks. Many such networks use Passive Optical Network (PON) technologies. These networks share physical infrastructure between many end customers, whereas copper networks provided dedicated facilities. There is no rational way to lease a fiber unless a point-to-point fiber plant has been constructed. All of this will create challenges until things are understood clearly from cost, price, and service standpoints. This could theoretically slow fiber deployments, but I suspect it will not do so much. I think we will see a lot of wailing and gnashing of teeth on this one, but nothing really important will happen.
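
To show why "shared" matters for wholesaling, here is the simple arithmetic for a GPON-style network. The 2.488 Gb/s downstream rate is the standard GPON figure; the 1:32 split is a typical deployment choice rather than a fixed rule.

# GPON downstream is roughly 2.488 Gb/s shared by every home on the splitter,
# whereas a copper pair (or a point-to-point fiber) is dedicated to one home.
gpon_downstream_gbps = 2.488
split_ratio = 32   # a typical, but not universal, deployment choice

per_home_if_saturated = gpon_downstream_gbps * 1000 / split_ratio
print(f"Worst-case share per home on a 1:{split_ratio} split: "
      f"~{per_home_if_saturated:.0f} Mb/s")
# In practice statistical multiplexing means each home usually sees far more,
# but there is no single dedicated facility to hand off to a competitive carrier.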

Another topic this week was a way of looking into cost models for rural broadband. The challenge here is that the model has been built for the Rate Cap (read: larger) carriers and applied to the Rate Base (read: smaller) carriers. Rate Base carriers follow the older model of how carriers get paid. In the olden days, telephone companies got paid a fixed rate of return based upon how much money they spent on equipment. That spending became known as the rate base. Because of this, telephone companies were a very stable investment for shareholders. Most carriers later converted to Rate Cap carriers to get a better stock price and to have some other regulations lessened. In this case, the charges of the carriers are capped and they can spend whatever they want to earn that revenue. Very small carriers are sometimes still Rate Base carriers and are subsidized by the US Government to make their revenue. They don't charge up to their Rate Base (in other words, their customers don't cover the telephone company's costs). Instead, the gap between what they can charge (the same as the Rate Cap carriers) and what they spend is covered by the US Government. The problem is that these new cost models are built for the larger carriers and won't work for the smaller ones.
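
A stripped-down version of the Rate Base arithmetic looks like this. The dollar figures and the 10% allowed return are invented for illustration; real proceedings set these numbers carrier by carrier.

# Illustrative Rate Base (rate-of-return) arithmetic; all numbers are made up.
rate_base = 10_000_000        # money spent on plant and equipment
allowed_return = 0.10         # regulated rate of return on that rate base
operating_expenses = 3_000_000

revenue_requirement = rate_base * allowed_return + operating_expenses

# The carrier's customers pay capped rates, the same as a Rate Cap carrier's,
# and the gap is covered by subsidy.
capped_customer_revenue = 2_500_000
subsidy = revenue_requirement - capped_customer_revenue

print(f"Revenue requirement: ${revenue_requirement:,.0f}")
print(f"Subsidy needed:      ${subsidy:,.0f}")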

My opinion is that these very small carriers probably don't make sense in today's environment anymore. We need to be looking to the States to help them combine so that they have enough bulk to become Rate Cap carriers. This can be done with other small carriers or potentially with rural spin-offs of larger carriers. However, there will be resistance to eliminating this historical part of the landscape. Jim Sackman Focal Point Business Coaching Business Coaching, Executive Training, Sales Training, Marketing

Change Your Business - Change Your Life!

Net Neutrality Friday

The rules came out yesterday from our friends at the FCC. They are complicated and a long read, and I am slogging through them. The place I want to start is with what the rules say they do not cover, and that is Interconnection Agreements. In the vernacular, these are called Peering Agreements. In the public view, this is where things all started: Comcast and Netflix. In the industry, it was based around the overturn of the last set of rules. But John Oliver and the broader public think of Interconnection. Now, I think the notion of Peering is a good one, in particular the notion of a "peer". The noun peer is defined as: a person of the same age, status, or ability as another specified person. In the world of the Internet, it meant that traffic would flow between two entities on a roughly equal basis.

Just as a counterexample, let me bring up the phone network. In the older world of the telephone, the company where the call originated collected the money. It then passed along any money owed to all the companies involved in delivering the call to its destination. That way the resources used to deliver the traffic were paid for at each step along the way. The Internet (in general) works differently here. The originator of traffic pays their service provider. Intermediate ISPs are not paid for traffic that transits their networks. As long as traffic is relatively symmetrical, there is no problem. It is a "bill and keep" model.
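
As a toy illustration of the difference between the two settlement models: in the old telephone model the originating carrier cascades payments along the path, while under bill and keep each ISP simply keeps whatever it bills its own customer. The per-minute rates and revenue figures below are invented.

# Toy comparison of settlement models; all rates are invented.

def telephone_settlement(minutes, retail_rate, transit_fees):
    """Originating carrier bills the caller, then pays each carrier on the path."""
    collected = minutes * retail_rate
    payouts = [minutes * fee for fee in transit_fees]
    originator_keeps = collected - sum(payouts)
    return originator_keeps, payouts

def bill_and_keep(own_customer_revenue):
    """Each ISP bills only its own customer and keeps it; no transit settlement."""
    return own_customer_revenue

kept, paid = telephone_settlement(minutes=100, retail_rate=0.10,
                                  transit_fees=[0.02, 0.03])
print("Telephone model: originator keeps", kept, "and pays out", paid)
print("Internet model: each ISP keeps its own billings, e.g.", bill_and_keep(50.0))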

Along comes Netflix. People watch videos in bulk from Netflix and on YouTube (which is owned by Google). Netflix hired Cogent to create a Content Distribution Network and send the requested videos out to customers. This created a massive imbalance: Netflix was a huge source of traffic, and almost none was being sent back. Many of the other large sources of traffic, like Google, have significant direct relationships with the major ISPs. Netflix did not, and this caused a significant amount of tension, most notably with Comcast. Given the popularity of Netflix, this is what caused all manner of distress with the public.

But now, this kind of thing is NOT going to be covered by Net Neutrality. I suspect that we will see this being a problem again, but not for a long time. As I posted a few weeks ago, it turned out that Cogent itself was the cause of the problems seen by Comcast customers with Netflix. So, not regulating these agreements is probably the right way to go for now.

Now we can start talking about the rules as written and the other end of the problem: Universal Broadband Access.

Jim Sackman Focal Point Business Coaching Change Your Business - Change Your Life! Business Coaching, Sales Training, Marketing Consultant, Behavioral Assessments, Business Planning