Bernstein Research has issued a report estimating that AWS will reach $20 billion in revenues by the end of the decade. In a separate report, R.W. Baird & Co. projects $10 billion in AWS revenue by 2016, and up to $40 billion in losses for the traditional IT market. The estimates reflect Wall Street's growing confidence in cloud services, and the need analysts see to tell their clients that a shift is underway, one that will lead to continued flat revenues or even losses for enterprise companies and systems integrators. In times of disruption, a service like AWS may actually exceed analyst projections. Conversely, AWS success is not a certainty: technologies may advance that flatten AWS advantages, or Amazon may not be able to scale the group's services fast enough to keep its edge. These are the factors investment research houses weigh when making corporate financial projections. Overall, Baird and Bernstein cite a number of reasons why AWS will do so well. The reasoning is sound but not without weaknesses, such as underestimating how much harder AWS success will be to come by with large enterprises.
The public cloud reached a turning point last year. As Baird states in its report, the top 10 cloud providers grew 37 percent while more traditional technology companies grew by 2 percent. Amazon does not break out AWS revenues, but Bernstein, based on its own research, estimates that AWS grew 100 percent year-over-year and is worth $24 billion, or about 13 times its estimated $1.8 billion in revenue. In contrast, Bernstein reports that Rackspace, as of April 1, trades at about 5.3 times 2012 revenue. Rackspace's revenue grew 28 percent year-over-year, and its public cloud service revenues of $309 million account for 23 percent of the total. Rackspace, it should be noted, has aggressive expansion plans of its own and was a co-founder of OpenStack, the open cloud effort. Developer interest in OpenStack has grown steadily since its unveiling. Google is a more concerning foe with Google Compute Engine and Google Apps, and Microsoft's Windows Azure puts that company in position to compete in the cloud space as well.
Enterprise Software: Baird Analyst Steve Ashley, quoted in the AWS report, “views client/server vendors like… SAP and BMC as most likely disintermediated, and SaaS companies like Salesforce, Concur, and cloud infrastructure vendors like Citrix and Red Hat as beneficiaries of the shift towards cloud computing.” The former are large, powerful companies that have built core businesses critical to IT. How these companies adapt is still a big question.
Customers will continue to invest in these providers to keep their businesses running. The time is now to ramp up and offer SaaS options, embrace mobile and offer friendly developer environments. "Over time traditional IT will be forced to go to SaaS. The first choice will be mobile," Forrester Research analyst Lauren E. Nelson told me in a recent interview.
IT Systems and Networking: According to Baird, HP and Dell have the potential to be most disrupted. EMC, VMware and NetApp also face exposure. When customers start using AWS, they stop buying storage and networking equipment. That certainly is a factor but data is so deeply embedded into these systems that analytics will give these players value over time. They also have deep histories with enterprise shops, and that dominance is not going to pass anytime soon. Realistically, AWS does have its own weaknesses in both storage and networking. ProfitBricks, for example, is leveraging its InfiniBand network and instance sizes that can scale to 62 cores and 240GB of RAM. HP is making its own play with OpenStack, as is Dell, which is also putting a deeper focus on next-generation storage and networking technology. Here’s Baird’s “competitive heat map” that shows how AWS stacks up compared to the rest of the market.
The greater danger stems from the AWS infrastructure and the breadth of what the cloud service offers. Developers get choice when they use AWS. It's useful for test and development, and as an extended network for media companies to offload peak demand. Pharmaceutical companies use AWS for its computational power and storage. According to the Bernstein report, customers that use AWS face high switching costs, because they use more services and customize their applications. And so even if there is more competition, AWS will most likely be able to protect its high margins, which Bernstein estimates at 25 percent. Amazon weathers pricing wars with the rapid addition of new features.
From the Bernstein report: "Importantly, the vast majority of the customers we interviewed stated that they would not change IaaS providers even for a 20% or so price discount, as switching would bring risks and, importantly, require them to invest scarce development resources, for example, to redevelop monitoring and management tools on the new provider's application programming interface (API). While customers running relatively simple applications (e.g., the front end of a website) did not see material switching barriers, most users we interviewed have more complex, and hence 'stickier' use cases."
AWS benefits from the steep demand in the market for computational, storage and networking power, and it is built on a distributed infrastructure. At its core, AWS is complex, owing to the additional technologies needed to make an app or service truly robust. Still, customers start with AWS and often remain loyal, and AWS has become an ecosystem for developers. Ask about a startup's infrastructure and the founder will often say it runs on AWS. But AWS is not the only game in town, and it has not yet proven that it can provide the deep infrastructure the enterprise requires. A 10-year cycle is underway that will keep a lot of infrastructure intact, on-premise. That gives the enterprise giants some time, but to date they have not shown a deep willingness to embrace the hyper-scale infrastructures that customers demand.
See the article here: Investment Firm Expects AWS Will Hit $20 Billion In Revenues By 2020
Mozilla today launched the second beta of its Persona authentication system, which now allows anybody with a Yahoo.com email address to sign up and log in to any Persona-enabled site without creating a new Persona account or a new login or password for the site that uses it. Mozilla plans to add support for additional webmail providers in the coming months.
With Persona, Mozilla wants to get rid of as many site-specific online passwords as possible. The goal of Persona is to ensure that users can sign in to websites without having to set up a new user account and without having to remember a new password. Users simply use their existing email account to identify themselves and the service will automatically redirect them to their webmail provider to type in their password and sign in.
Thanks to the new identity bridging with Yahoo that the organization announced today, hundreds of millions of new users can now use Persona to sign in to sites that already support the system. Mozilla has also set up a demo site for those who want to give it a try.
The Mozilla team, however, also used today’s announcement to stress that Persona is an open system and that “any domain can now become a Persona Identity Provider so users can reuse their existing accounts on any site that uses Persona.” As Mozilla noted last month, Persona is meant to give administrators a lot of flexibility. “If example.com wants to use 2-digit passwords, they can. If they want to use retinal scans powered by your webcam, they can. It’s up to them.”
Adding Persona support to an existing site, Mozilla says, can take as little as 15 minutes. Persona is not bound to any specific browser and works on both desktop and mobile (as long as you don't insist on using Internet Explorer 7 or below).
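The server-side half of that 15-minute integration follows Persona's documented verification protocol: the browser hands the site a signed "assertion," and the site POSTs it to Mozilla's remote verifier along with its own origin (the "audience"), then trusts the JSON reply. A minimal sketch in Python, assuming the documented verifier endpoint and field names; the helper names are our own:

```python
# Server-side Persona assertion verification, per the documented protocol.
# The client obtains an assertion via navigator.id.request(); the site
# posts it to Mozilla's hosted verifier and checks the JSON response.

import json
from urllib import request, parse

VERIFIER_URL = "https://verifier.login.persona.org/verify"

def build_verify_request(assertion, audience):
    """Build the POST body the remote verifier expects."""
    return parse.urlencode({"assertion": assertion, "audience": audience}).encode()

def interpret_verifier_response(reply, expected_audience):
    """Return the signed-in email address, or None if the assertion is invalid."""
    if reply.get("status") != "okay":
        return None
    # The audience must match our own origin; otherwise an assertion
    # issued for another site could be replayed against ours.
    if reply.get("audience") != expected_audience:
        return None
    return reply.get("email")

def verify_assertion(assertion, audience):
    """POST the assertion to the verifier and interpret the result."""
    req = request.Request(VERIFIER_URL, data=build_verify_request(assertion, audience))
    with request.urlopen(req) as resp:
        return interpret_verifier_response(json.load(resp), audience)
```

On success the site sets its own session cookie for the returned email; no site-specific password is ever stored.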
With the release of this second beta, Persona is now also integrated into Firefox OS, Mozilla’s attempt at launching a new mobile operating system based on HTML5. The Persona team also worked hard on improving performance and the service now loads twice as fast as the last beta.
While the media continues to debate the severity of the denial-of-service attacks taking place across the web this month, they appear to have claimed another victim: payments startup Dwolla announced today that it, too, is now experiencing a distributed denial-of-service (DDoS) attack. The attack, which is still underway, began yesterday, resulting in limited or no availability of the company's website, Dwolla.com.
In a brief message posted to Dwolla’s blog, the company says that the event is still ongoing, and is preventing people from viewing the site and accessing Dwolla’s service. Also affected are third-party developers, who are using the company’s APIs to integrate Dwolla’s payment technologies into their own sites and services.
These developers were notified today, and Dwolla says that it’s working with service providers to resolve the issue.
Responding in the comments section of the post, the company told concerned users and developers that the consumer-facing API is unavailable at present, but as far as the company knows right now, actual fraud is not involved – that is, there’s no risk to users’ money, nor will this have affected transactions that took place before the attacks began.
“Funds are fine, and we do have our fraud team actively monitoring the entire situation,” wrote a Dwolla company representative, addressing a commenter’s complaint.
The company says that the attack is actually affecting its hosting provider, and they’re unsure at this time if it’s related to the SpamHaus situation.
One of the service providers that Dwolla is working with is CloudFlare, the Internet security firm that’s stepped in to protect a number of companies in the wake of these recent attacks. (You can see a CloudFlare message appear upon visiting the Dwolla.com domain at present).
The New York Times quoted CloudFlare CEO Matthew Prince this week, who equated the DDoS attacks to the Internet’s version of a “nuclear bomb.” Gizmodo later followed up on this report and another from the BBC, downplaying the scale of the attacks – they’re not affecting the entire Internet, Gizmodo claims.
Full text of the Dwolla.com blog post below, in case you’re unable to pull it up yourself (or choose not to, out of kindness):
Yesterday afternoon, Dwolla’s service providers became the victim of a distributed denial of service event, resulting in limited or no availability to the website, Dwolla.com.
This advanced event still persists today and is preventing people from viewing the website and consequently accessing its services. We apologize for this inconvenience and are working hard with our service providers to resolve the issue.
In the meantime, we will continue to update this post with more details.
(UPDATE 1:50pm CT: Third-party developers have been formally notified of the service interruption. Our team continues to work closely with service providers.)
John Sheehan is a co-founder of Runscope, an API tools company in San Francisco looking for engineers and designers to help build the future of developer tools for API-driven applications. Previously, John was at IFTTT and Twilio.
Last week Netflix announced that it was no longer going to issue developer keys for its public API, effectively ending its open API program.
This type of change isn’t unheard of. Consumer internet services (including the social networks) are increasingly moving to a private/partner API model where a more formal partnership must exist in order to use the API.
Some more recent social networks like Path, Vine and Google+ don't even have usable open, public APIs. For these services, the traditional open API model is, for all intents and purposes, dead.
Use of APIs on the whole is growing like crazy. Infrastructure providers like Twilio, SendGrid and Stripe have shaken up entrenched, crowded markets by providing better APIs. Building SaaS without an API? Good luck landing that big deal with someone who wants to do a custom integration into their legacy back-end system. Companies like IFTTT are exposing APIs to the masses without them even knowing it.
Even some consumer internet companies are getting great results from their API programs. Ninety percent of Expedia's business comes through its API. For eBay, 60 percent of listings come through its web services (and that was back in 2008; I imagine it's much higher now). Open APIs that drive direct, mutually beneficial transactions work.
That’s just scratching the surface on what we can actually see. By my own estimate, I believe that the vast majority of API traffic is on ‘dark APIs’ behind a corporate firewall or powering mobile applications. At a recent conference, a team from Target presented on the rapid API-ification of their internal infrastructure.
As they become an increasingly service-driven organization, they're seeing more than just technological benefits: they're able to extract new business intelligence from analyzing their API traffic.
So why aren’t more consumer internet companies seeing the same value? In short, the interests of the developer and the API provider aren’t aligned.
Is an app that helps you manage your Netflix queue driving meaningful new subscriptions for Netflix? Probably not. Is another Twitter client helping Twitter sell and show you ads? Definitely not. When the most important transaction for Twitter was someone putting content into the network, it made sense to allow that content from anywhere. That’s no longer important to them. This is the future of Twitter APIs.
For app developers, it’s time for us to be smarter about the services we rely on in our applications. It’s no longer acceptable to depend on a third-party API provider for mission-critical functionality without taking steps to protect yourself from the whims of another company. Here’s how to mitigate the risk.
Thou shalt not freeload
For infrastructure and SaaS APIs, the relationship is clear: you pay for the value you receive, either transactionally or as part of your subscription. For everything else, the provider of the API you are using should benefit as much as or more than you do from the value your use of the API provides. If your app is not driving direct transactional value for the provider, you're in a risky situation.
Thou shalt not forego talking to a person
An open API is a great way to test drive an integration, but it does not absolve you from the responsibility of building a relationship with the provider. If you can’t reach someone, that should be all the reason you need not to use that API.
Thou shalt monitor everything
Using a third-party API is code for your application that happens to run on someone else's servers. Apply the same rigor for monitoring and testing that you would to the code that runs on your own machines. When something goes wrong (and it will), have systems in place to notify you before your customers do.
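The monitoring rule can be sketched in a few lines: wrap every third-party API call so that latency and consecutive failures are tracked, and an alert fires before your customers notice. The class name, thresholds, and alert hook below are illustrative, not from any particular monitoring product:

```python
# Illustrative wrapper for third-party API calls: track consecutive
# failures and slow responses, and invoke an alert callback when either
# crosses a threshold.

import time

class MonitoredCall:
    def __init__(self, name, alert, max_failures=3, slow_threshold=2.0):
        self.name = name
        self.alert = alert                    # callable invoked with a message
        self.max_failures = max_failures      # consecutive errors before alerting
        self.slow_threshold = slow_threshold  # seconds before a "slow" alert
        self.consecutive_failures = 0

    def __call__(self, fn, *args, **kwargs):
        start = time.monotonic()
        try:
            result = fn(*args, **kwargs)
        except Exception as exc:
            self.consecutive_failures += 1
            if self.consecutive_failures >= self.max_failures:
                self.alert(f"{self.name}: {self.consecutive_failures} failures in a row ({exc})")
            raise
        self.consecutive_failures = 0  # any success resets the streak
        elapsed = time.monotonic() - start
        if elapsed > self.slow_threshold:
            self.alert(f"{self.name}: slow response ({elapsed:.2f}s)")
        return result
```

Usage would look something like `twitter = MonitoredCall("twitter-api", pager.notify)` and then `twitter(session.get, url)` around each request; the point is that every outbound dependency gets the same instrumentation as your own code.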
APIs are an amazing tool, but they’re a means to an end. When the interests of the provider and consumer are aligned, great things can be accomplished with very little technical effort. Take care in your applications to protect yourself and your customers by being smart about what services you rely on.
Image Credit: MARTIN OESER/AFP/Getty Images
Go here to see the original: APIs are Dead, Long Live APIs
After my panel on Friday at SXSW, Paul Underwood of Deloitte and Will Lovegrove, CEO of Datownia, approached me to talk about their companies. Their viewpoints demonstrate the direction of enterprise app development and the shift to a developer-centric IT world.
Datownia's spreadsheet-based API platform and the scale of Deloitte's internal apps marketplace represent two trends: Datownia shows the types of tools emerging to tame the complex, intricate work of building and connecting apps; Deloitte points to the need for ways to share and organize the rush of apps that enterprise developers are creating at unprecedented rates.
Datownia offers what it calls API-as-a-Service. It turns a spreadsheet into an API: the spreadsheet is shared through Box or Dropbox and then hooked into the Datownia platform. Once the API is created, business or IT systems data can be updated through the spreadsheet and accessed by any number of developers.
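The core idea of a spreadsheet-backed API can be sketched in miniature: each row becomes a JSON record keyed by the header row, so anyone who can edit the sheet can publish data to developers. This is an illustration of the concept only, not Datownia's actual platform:

```python
# Concept sketch: turn spreadsheet-style CSV text into API-ready JSON records.
# Each header cell becomes a field name; each row becomes one record.

import csv
import io
import json

def sheet_to_records(csv_text):
    """Turn CSV text (e.g. an exported spreadsheet) into a list of records."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return [dict(row) for row in reader]

def sheet_to_json(csv_text):
    """What a GET against the generated endpoint might return."""
    return json.dumps(sheet_to_records(csv_text))
```

A platform like Datownia layers access keys, versioning and hosting on top, but the underlying transformation is this simple, which is why a business user editing a sheet can effectively publish an API.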
Deloitte has built its own “App Center,” which offers 146 apps, said Underwood, who works in the Office of Technology Innovation:
We use a common RESTful web service architecture and two HTML5 front ends (one for phones, one for tablets/PCs). The HTML5 front ends are embedded into native containers for iOS, Android, BB, and Windows Phone (similar to PhoneGap, but our own implementation). App Center's UI follows a Single Page App pattern, which allows us to embed the UI into the container and provide as native an experience as possible.
As companies build more apps, the value of PaaS will become apparent. Companies will not build tens of thousands of apps internally on an IT infrastructure meant for email, Word docs and old-school mission-critical apps. It's likely they will use PaaS providers externally, internally or both to create new apps and atomize the systems of record, such as SAP for business software or Salesforce for CRM. Those PaaS providers will then push those apps to an internal marketplace or to any number of external ones.
For Deloitte, the app store is a way to extend influence inside and outside the company, Underwood said. Apps are delivered through a variety of models: a mix of internally developed apps, vendor-produced and white-label apps, and hybrids made by Deloitte with different vendors.
Deloitte is so large, and there are so many teams with mobile initiatives, that it's difficult to conceive of a standardized delivery model being successful. Personally, I've seen the best apps come from hybrid vendor-internal teams. Vendor management is becoming an increasingly critical skill for our mobile enterprise initiatives. One of the major reasons we built App Center was that, after realizing the explosion of enterprise app development underway at Deloitte, we needed a place to discover what was being done and better define what excellence in mobile enterprise means. An ugly, pointless app on a public app store erodes our brand. App Center brings apps to the light of day, both the good and the bad. Ultimately we hope this openness drives quality.
The influence of an intensive developer-centric model gives weight to PaaS. For example, Deloitte is adding more PaaS offerings beyond App Center to support the various member firm goals, Underwood said. "I think the model is changing the value prop for IT in our complex corporate structure. We may have differing opinions on what PaaS means, but that topic deserves its own email."
Abstracting the hardware of internal and external infrastructures has allowed Engine Yard, a PaaS provider, to put its platform behind the walls of a corporate data center. Oracle recently made an investment in Engine Yard, providing the IT giant with a service it can pop into a customer’s infrastructure.
As more apps get developed, workflow complexity is an issue that PaaS providers can help solve. Moving data from data stores or between apps is often a manual task. Datownia's service abstracts away the friction of passing information among developers, IT and business users. But it does not solve the lack of common data standards around APIs. I caught up with John Musser of ProgrammableWeb fame here at SXSW on Saturday. Musser said it will be some time before we see open data standards, so in the meantime there will be smaller steps to take; REST-based APIs and JSON are examples.
But in the end, developers need easier ways to connect data sources so apps can have more depth. Datownia offers a way to sync data, which can make it easier to build out the apps users need. PaaS providers can serve as a data-"normalizing" environment so these new types of apps can be built. App stores are a natural fit and should become far more sophisticated, as is evident in just the initial manifestations of Deloitte's homegrown app store.
We’ll see how fast the demand is for internal app stores. As they do become more common, you can expect that the PaaS market will play a vital role and help fulfill the promise of turning the enterprise into a developer-centric environment.
See original here: How App Stores Can Become A Catalyst For A Developer-Focused IT Universe