New Options for Off-Site Data Backup and Remote Computing
Ensuring the safety and security of your business
If we’ve learned anything in the past seven years, it’s that we’re vulnerable to natural and man-made disasters. Attacks on the World Trade Center, catastrophic hurricanes, and other tragic events have cost us dearly, but they’ve also served as a wake-up call for government and industry. As business becomes increasingly paperless, our historical data, along with all of our daily operations, resides on computers and storage drives, most of them located on our own premises. How well would your business survive if something unexpectedly wiped out all of your data?
Most of us have experienced the pain of a hard-drive failure, but few have experienced complete data loss. One emerging, inexpensive service can help prevent a total loss of business-critical data: remote backup via Amazon Web Services (http://aws.amazon.com). Amazon has figured out that it not only has excess capacity in its vast data centers, but that it can also leverage its own IT resources to create an entirely new revenue channel by selling these services and capabilities.
This concept is also the basis for a very good and green idea: server virtualization. It allows companies to radically reduce or eliminate the number of servers they run in house, as well as the dedicated and hosted servers they keep in data centers elsewhere. In California, one very large utility (Pacific Gas and Electric) estimates that virtualizing server farms can cut a data center’s electrical load by as much as 96% of current usage, a hugely significant number in today’s energy-starved economy. That cut means not only lower electrical usage, but also a corresponding reduction in the carbon footprint associated with it.
Amazon Simple Storage Service (Amazon S3) provides a simple Web-services interface that can be used to store and retrieve any amount of data, at any time, from anywhere on the Web. It gives any developer access to the same highly scalable, reliable, fast, inexpensive data-storage infrastructure that Amazon.com uses to run its own global network of Websites. The service aims to maximize benefits of scale and to pass those benefits on to developers and end users.
Amazon S3 is a very inexpensive service that provides secure storage and data transfer for clients. Originally conceived as an option for hosting bandwidth-intensive applications, such as hosted video, it is rapidly becoming a viable alternative for off-site disaster-recovery backup of critical company data. The prices are extremely attractive compared to most conventional Internet service providers (ISPs) and data centers. As I write this, anyone can sign up for the service from Amazon’s main page, but implementation and setup still require significant technical savvy. You pay based on how much storage you use, how many requests you make for data, and how much bandwidth you transfer. There are no contracts, and transfer fees range between $0.11 and $0.18 per gigabyte.
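To see how this pay-for-what-you-use pricing adds up, here’s a minimal back-of-the-envelope sketch in Python. Only the $0.11–$0.18 transfer range comes from the figures above; the storage rate is an assumption for illustration, and actual numbers should always come from Amazon’s current price sheet.

```python
# Rough estimate of a monthly bill under usage-based pricing.
# ASSUMPTIONS: the $0.15/GB-month storage rate is illustrative only;
# the transfer rate uses the top of the $0.11-0.18/GB range quoted above.

def estimate_monthly_cost(storage_gb, transfer_gb,
                          storage_rate=0.15, transfer_rate=0.18):
    """Return an estimated monthly bill in dollars."""
    return storage_gb * storage_rate + transfer_gb * transfer_rate

# A 200-GB archive plus 50 GB of transfer in a month:
print(round(estimate_monthly_cost(200, 50), 2))  # → 39.0
```

Even at the top of the range, a couple hundred gigabytes of archival storage comes in at a fraction of what a hosted server at a conventional data center would cost.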
You’ll need help to set the system up, unless you have someone familiar with server and data protocols, configuration, and development. My shop is in the process of configuring S3 for all of our archival backup and company records. Amazon S3 will only be one of several alternative backup options for us.
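To give a feel for the bookkeeping such a configuration involves, here is a sketch (not Amazon’s actual tooling) that walks an archive directory, derives an S3-style object key for each file, and records an MD5 checksum that can later be compared against what the service reports for an upload. The archive/2008 prefix and directory layout are hypothetical; the upload itself would go through an S3 client library.

```python
# Sketch of the local bookkeeping for an off-site backup job: map each
# file under an archive directory to an S3-style object key and record
# its MD5 checksum so uploads can be verified afterward.
# The "archive/2008" prefix is a hypothetical naming convention.

import hashlib
import os

def md5sum(path):
    """Return the hex MD5 digest of a file, read in chunks."""
    digest = hashlib.md5()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()

def object_key(path, root, prefix="archive/2008"):
    """Map a local file path to a slash-separated object key under a prefix."""
    rel = os.path.relpath(path, root).replace(os.sep, "/")
    return f"{prefix}/{rel}"

def build_manifest(root):
    """Return {object_key: md5_checksum} for every file under root."""
    manifest = {}
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            full = os.path.join(dirpath, name)
            manifest[object_key(full, root)] = md5sum(full)
    return manifest
```

The manifest itself is worth backing up too: it’s the record that lets you confirm, months later, that what came back from the remote store is what went in.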
I’ve been aware of the need for off-site backup for quite some time, but, like most companies, my shop just hadn’t gotten around to it. In the process of researching alternatives like S3, I became aware of a much bigger trend that is growing steadily: cloud computing. Cloud computing is the Internet-based development and use of computer technology. Within this framework, software is provided as a service (Software as a Service, or SaaS) that lets users access sophisticated, tech-enabled services in the cloud without knowledge of, expertise with, or control over the computers, servers, software, data centers, or infrastructure supporting them. Cloud computing has long been an awesome theoretical idea, but it’s now coming of age quickly.
Cloud computing incorporates SaaS, Web 2.0, and other recent, well-known technology trends, where the common theme is reliance on the Internet to satisfy users’ computing needs. Google, for example, offers Google Apps to provide common business applications (such as e-mail and word processing) online, accessed from a Web browser while the software and data are stored on Google’s massive servers. Some successful cloud architectures have little or no centralized infrastructure or billing systems whatsoever, including peer-to-peer networks like BitTorrent and Skype.
SaaS is a model for software deployment where an application is hosted as a service provided to customers across the Internet. By eliminating the need to install and run the application on the customer’s own computer, SaaS also eliminates the customer’s responsibility for software maintenance, ongoing operation, and support.
Customers generally relinquish control over software versions and customization. Costs become a continuous expense, much like a subscription, rather than a single expense at the time of purchase. To address that loss of customization, many SaaS platforms offer an application programming interface (API) that allows third-party customization. Many of these Web-based applications also use Open Source software, which enables anyone, anywhere, to write specific code and have it integrated for their own use.
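The plug-in pattern behind such APIs can be illustrated with a toy sketch: the platform exposes a registration hook, and third parties extend behavior without touching the core code. All class and extension names here are hypothetical, not any vendor’s actual API.

```python
# Toy illustration of SaaS-style API customization: the platform core
# stays closed, but third parties register named extensions against it.
# "Platform" and "flag_overdue" are hypothetical names for illustration.

class Platform:
    def __init__(self):
        self._plugins = {}

    def register(self, name, func):
        """Third parties call this to add a named extension."""
        self._plugins[name] = func

    def run(self, name, record):
        """The platform invokes a registered extension on a customer record."""
        return self._plugins[name](record)

platform = Platform()

# A third-party extension: flag overdue invoices in a contact record.
platform.register(
    "flag_overdue",
    lambda rec: {**rec, "overdue": rec["days_unpaid"] > 30},
)

result = platform.run("flag_overdue", {"customer": "Acme", "days_unpaid": 45})
print(result["overdue"])  # → True
```

The point of the pattern is that the vendor never ships its source: customization happens entirely through the published interface.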
Using SaaS can also reduce the up-front expense of software purchases through less costly, on-demand pricing. From the software vendor’s standpoint, SaaS has the attraction of providing stronger protection of its intellectual property and establishing an ongoing revenue stream. The SaaS vendor may host the application on its own Web server, or this function may be handled by a third-party application service provider (ASP). This way, end users may reduce their investment in server hardware as well. As software development becomes more and more complex, and major new version releases (like Adobe’s Creative Suite) become more and more expensive, SaaS offers both vendors and users considerable economic flexibility.
An example of this is Salesforce.com, an online sales, marketing, and contact-management application. Salesforce.com has been so successful that the company now offers Force.com, a flexible, scalable software-application platform; the company realized that its underlying platform could do far more than the Salesforce.com application itself. Force.com builds its core database structure on free, open-source code, and the company uses the same kinds of tools Google does so that third parties can write their own software as plug-ins that extend specific capabilities. Force.com currently offers more than 800 third-party applications that users can add to customize their sites to do virtually anything they can imagine when it comes to organizing and running a business. You’ll find the same approach at Google, Yahoo, Firefox, and Amazon, and it’s emerging to a lesser degree at IBM, HP, and even Microsoft.
Cloud-computing infrastructure mostly consists of reliable services delivered through next-generation data centers built on computational and storage-virtualization technologies. The services are accessible anywhere in the world, with The Cloud appearing as a single point of access for all of a consumer’s computing needs. On the surface, this sounds like it could be a security nightmare, so commercial offerings must meet the quality-of-service requirements customers demand, and they typically offer service-level agreements and specific security guarantees. The use of Open Standards and Open Source software (like Linux, PHP, MySQL, Joomla, and so forth) is also critical to the growth of cloud computing.
Customers in cloud computing generally do not own the infrastructure; they merely rent access to it. This spares them capital expenditures: instead, they pay as they use the service. Many cloud-computing offerings have adopted the utility model, analogous to how a traditional electrical utility charges for usage. In this case, there may be a basic monthly charge, with additional fees based on consumption. Other SaaS companies bill strictly on a subscription basis. By sharing computing power among multiple tenants, providers can improve utilization rates (as idle servers are put to work), which is one of the main ways of reducing power, capital, and operating costs.
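The utility model described above comes down to a simple rate calculation: a flat base charge plus metered fees on consumption beyond an included allowance. The base charge, allowance, and per-hour rate below are illustrative assumptions, not any provider’s real pricing.

```python
# Sketch of utility-style cloud billing: flat monthly base charge plus a
# metered rate on usage above an included allowance.
# ASSUMED rates: $20 base, 100 included compute-hours, $0.10/hour overage.

def utility_bill(hours_used, base=20.00, included_hours=100, rate=0.10):
    """Base charge plus a per-hour rate on usage beyond the allowance."""
    overage = max(0, hours_used - included_hours)
    return base + overage * rate

print(utility_bill(80))   # within the allowance: base charge only
print(utility_bill(250))  # 150 overage hours billed at the metered rate
```

Compare this with flat subscription billing: the utility model rewards light users and scales smoothly for heavy ones, which is exactly the electrical-utility analogy.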
Another advantage of the SaaS model is the speed of application development. Tremendous savings come from not having to deal with packaging, distribution, channel costs, advertising, commissions, and the technical support associated with a major version release. A side effect is that customers’ IT capacity is freed up, because they no longer have to manage the engineering side of the equation or handle the extensive training and implementation that comes with a new release.
The increasing availability of high-speed bandwidth also has increased the adoption of cloud computing. Users can experience the same response times from the centralized infrastructures at remote locations as they now have over their own networks. Development of new data-transfer protocols will only increase this bandwidth capacity.
Remote soft proofing, which is gaining in popularity in our industry, is a perfect example of SaaS, as it relates to a specific problem we face: getting creators, content providers, and production all on the same page visually. Not only does this kind of collaborative capability speed the production process up, but it also lowers overall costs for all parties involved. Remote soft proofing via SaaS will grow in importance for us as we continue to see digital and variable-data imaging drive the average order size down.
The biggest single objection to SaaS and cloud computing is also one of its strengths—that is, having to go outside your controlled environment to get your service or data. In today’s hacker/virus-plagued world, safeguards provided by your vendor are critical to the ongoing viability of your important content, intellectual property, customer data, and business information.
A close look at the overall state of the software and Web communities reveals that we’ll see more and more of these Web-deployed services. As transportation and energy costs continue to burden us, services like Webinars, video teleconferencing, and teleseminars will increase. Likewise, the rise of mobile computing with Web-enabled smart devices like Apple’s iPhone and the RIM BlackBerry will further the convergence in this area. As long as we can connect to the Web, we’ll be able to conduct business from anywhere, at any time.
Mark A. Coudray is president of Coudray Graphic Technologies, San Luis Obispo, CA. He has served as a director of the Specialty Graphic Imaging Association Int'l (SGIA) and as chairman of the Academy of Screenprinting Technology. Coudray has authored more than 250 papers and articles over the last 20 years, and he received the SGIA's Swormstedt Award in 1992 and 1994. He can be reached via e-mail at email@example.com.