Expert: Governments are moving data to the cloud (with related video)
GPN reached out to Greg Arnette to find out about cloud archiving trends in government. Arnette is founder and CTO of Sonian. The firm’s cloud-based archiving offerings preserve and protect an organization’s information, making it searchable and actionable. Greg Arnette’s views are below.
GPN: What are the main ways that governments are using the cloud today?
Greg Arnette: Most of the government prospects we speak to seem to look at the cloud as a large, secure and inexpensive disk drive, and they're not wrong. The cloud is a great solution for migrating legacy data from expensive (and aging) on-premises storage media. Securing it in the cloud generally means higher availability and reliability at a fixed and predictable cost.
GPN: How will governments use the cloud in the future?
GA: This may be a bit farther down the road than for their corporate counterparts, but I have no doubt that government entities will adopt, on a large scale, the construction of applications on the cloud, either as replacements for "production applications" they are supporting on other platforms, or as brand-new applications. The ease of development and ease of deployment simply can't be matched, so I do feel that it is a foregone conclusion; the timetable is the only question mark. You can't compare conventional "waterfall application development" with a dynamically scalable environment in which you can "burst test" a prototype and then scale down those stacks when the test is complete.
GPN: What cloud applications offer the most benefit to local and state governments?
GA: A lot of state and local governments have recently come under fire for deleting e-mails that were later needed for investigation or even litigation. Because this information “belongs to the citizens,” policies that dictate deletion are increasingly becoming flash points. With cloud archiving, there is no reason to delete e-mail, and simply no excuse for not being able to produce it. Added to this transparency is the fact that migrating this data to the cloud often presents the cheapest and most secure storage option.
GPN: Do you have any advice for local government officials on implementing an IT setup that includes a cloud component?
GA: "Start small" is always my advice for any new platform. Find a low-risk application, or simply start to journal on-premises data to the cloud, before taking on a project to migrate all of your production applications. Moving to the cloud should be on everyone's 12-month plan, but moving there intelligently is your best bet. Get familiar with cloud monitoring and management services on low-risk apps before you have to depend on your ability to monitor load balancing and its cost implications.
GPN: What is really important in the process that governments need to focus on?
GA: Importing legacy data can be time-consuming and expensive. It is important that government entities weigh that spend against the need to get off decaying local storage. The cost benefit of the cloud can be completely wiped out if the first step is to ingest two petabytes of legacy data. De-duping the data locally, while it will take longer, means a smaller ingestion queue, which will cost less. It is also possible to negotiate lower ingestion rates by agreeing to perform this task over a longer period of time and during "off hours," when compute is less costly.
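The local de-duplication Arnette describes can be illustrated with a minimal sketch. This is not Sonian's implementation, just one common approach: hash each file's contents and queue only the first copy of each unique payload for ingestion. The function name and chunk size are illustrative assumptions.

```python
import hashlib

def dedupe_for_ingestion(paths):
    """Return only the files with unique content, keyed by SHA-256 digest.

    Files whose bytes hash identically are queued once; duplicates are
    skipped, shrinking the upload queue before any data leaves the site.
    This is an illustrative sketch, not a production archiving pipeline.
    """
    seen = set()
    unique = []
    for path in paths:
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            # Hash in 1 MB chunks so large legacy files don't load into RAM.
            for chunk in iter(lambda: f.read(1 << 20), b""):
                digest.update(chunk)
        key = digest.hexdigest()
        if key not in seen:
            seen.add(key)
            unique.append(path)
    return unique
```

On a real archive, the hash index itself can be kept and shipped with the data, so restored items can later be verified against the same digests.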
GPN: Thank you, Greg Arnette, for your views.
In the video, Brendan Wright of Sonian talks about cloud-based solutions on the horizon.