What kind of IT infrastructure do today’s businesses need to support the rapid deployment of new applications and services?
Flash-based storage and infrastructure in next-generation data centers are capable of providing the flexibility businesses need. While flash used to be deployed for point solutions or to accelerate one application that needed rapid response time, flash technology is “increasingly prevalent with any data center,” said Troy Brick-Margelofsky, storage solution architect team lead at CDW, during a recent CDW webinar sponsored by NetApp.
Modern data center architectures, especially those using flash, can provide data protection, disaster recovery and deduplication, and quality of service, noted Stephan Stelter, a storage solution engineer at NetApp. All of those functions are necessary to keep up with the DevOps model of development that iterates constantly, deploys apps rapidly and continuously improves services for customers.
Public cloud providers can offer DevOps tools to companies today, Stelter noted during the webinar. “That ability to very easily and quickly consume a new paradigm, a new way of developing, a new way of interacting with our end users has raised the bar for IT,” he said. “We need to provide ways to service our business customer in a similar fashion.”
There are aspects of the public cloud that non-cloud IT providers can deliver upon, Stelter said. “And the infrastructure that we build as data center architects needs to be able to support those,” he added.
Whether users are consuming from the cloud or accessing services internally, they need predictable performance. “You need to be able to provide the scalable architecture in terms of capacity, as well as in terms of performance,” Stelter said. “As the need increases or demand increases for any application, you’re able to meet that demand very easily.”
Above all else, companies need data protection for metadata and stored data. “You don’t have anything if you don’t have your data,” Stelter said. “You need to make sure the data that represents your organization is protected and available.”
Data services such as snapshot, replication, deduplication and compression are also important, he said.
Snapshots allow a company to revert to a previous state if it is attacked by ransomware. Replication can rapidly reproduce a company’s database in another location so that it can be recovered in the event of a disaster. Automation helps solve the challenge of “how do we get services delivered in a timely fashion,” Stelter said.
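The snapshot-based recovery described above can be sketched in a few lines. This is a toy, in-memory illustration of point-in-time snapshots and rollback; the class and method names are invented for this example and do not represent NetApp's or SolidFire's actual implementation, which uses copy-on-write metadata rather than full copies.

```python
import copy

class Volume:
    """Toy volume with point-in-time snapshots (illustrative only)."""

    def __init__(self):
        self.blocks = {}     # block address -> data
        self.snapshots = {}  # snapshot name -> frozen copy of blocks

    def write(self, addr, data):
        self.blocks[addr] = data

    def snapshot(self, name):
        # Freeze the current state. Real arrays record copy-on-write
        # metadata instead of duplicating every block.
        self.snapshots[name] = copy.deepcopy(self.blocks)

    def revert(self, name):
        # Roll the volume back to a known-good state,
        # e.g. after a ransomware attack encrypts the live data.
        self.blocks = copy.deepcopy(self.snapshots[name])

vol = Volume()
vol.write(0, b"payroll.db v1")
vol.snapshot("pre-attack")
vol.write(0, b"ENCRYPTED BY RANSOMWARE")
vol.revert("pre-attack")
print(vol.blocks[0])  # b'payroll.db v1'
```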
In the DevOps model, development and production streams run closely together. “In order to run that as efficiently as possible, you can run it on the same IT infrastructure hardware,” Stelter said. “Many organizations are doing exactly that. In order to be successful in doing that, you need to have quality of service in place.”
That QoS extends to networks, hardware and the way applications interact with the underlying data, he said. Companies can set minimum and maximum levels of service, as well as allow for bursts in capacity.
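The min/max/burst policy Stelter describes can be sketched as a simple clamping rule. The function below is a hypothetical illustration of how such a policy behaves, not a vendor API: a volume is guaranteed its minimum IOPS, normally capped at its maximum, and allowed above the maximum up to the burst limit while burst credit lasts.

```python
def effective_iops(requested, min_iops, max_iops, burst_iops, burst_credit):
    """Clamp a workload's requested IOPS to its QoS policy (illustrative).

    min_iops is the guaranteed floor, max_iops the normal ceiling, and
    burst_iops a temporary ceiling usable while burst_credit > 0.
    """
    ceiling = burst_iops if burst_credit > 0 else max_iops
    return max(min_iops, min(requested, ceiling))

# A volume with a 500 / 1,000 / 2,000 min/max/burst policy:
print(effective_iops(1500, 500, 1000, 2000, burst_credit=10))  # 1500 (bursting)
print(effective_iops(1500, 500, 1000, 2000, burst_credit=0))   # 1000 (capped at max)
```

This is why dev and production can share hardware safely: production keeps its guaranteed floor even when a dev workload tries to consume everything.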
Stelter highlighted the benefits of NetApp’s all-flash SolidFire infrastructure. It’s a node-based architecture that delivers “extremely high performance” by using solid-state drives, he noted. Its “scale out” architecture allows businesses to easily add nodes of varying capacities to the same cluster to improve the performance or capacity of the environment.
“The infrastructure was designed with a service provider mentality,” Stelter said. “It allows you to put a new node in and take an old node out, and do these things without disrupting service at all, without disrupting the performance at all.”
The self-healing nature of the architecture allows companies to easily tolerate drive, component and power supply failures, he said. Additionally, automation is part of the way the system works, allowing companies to run multiple versions of code streams or let developers request a fork in the code. “They are able to do that and carve up a new volume with just a line of text,” he said.
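Provisioning a volume “with just a line of text” typically means driving the storage system through an API call. The payload below is a hypothetical sketch of such a request; the method and field names are illustrative and should be checked against the actual SolidFire Element API documentation rather than taken as-is.

```python
import json

# Hypothetical JSON-RPC-style request to carve out a new dev volume.
# Method and parameter names are illustrative, not a documented API.
request = {
    "method": "CreateVolume",
    "params": {
        "name": "dev-fork-42",          # invented volume name
        "totalSize": 100 * 2**30,       # 100 GiB in bytes
        "qos": {"minIOPS": 500, "maxIOPS": 1000, "burstIOPS": 2000},
    },
}
print(json.dumps(request))
```

In practice a script or CI pipeline would POST this to the cluster endpoint, which is how developers can request a code fork and get a matching volume without filing a ticket.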
The cost of flash-based arrays is falling, he said, and the global deduplication capabilities of SolidFire, with compression and thin provisioning, allow companies to be more efficient with their storage resources.
“That’s something that is on by default in these systems to address the need to optimize and really drive home this idea of, ‘How can I do more with less in my architecture?’” he said.
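The space savings from deduplication come from storing each unique block once and keeping per-volume references to it. The sketch below is a simplified, content-addressed illustration of the idea, not SolidFire's actual implementation.

```python
import hashlib

def dedup(blocks):
    """Content-addressed block store: identical blocks stored once.

    A simplified illustration of global deduplication; real systems
    dedupe across all volumes in the cluster at the block layer.
    """
    store = {}  # fingerprint -> unique block payload
    refs = []   # one reference per logical block written
    for block in blocks:
        fp = hashlib.sha256(block).hexdigest()
        store.setdefault(fp, block)  # keep payload only if unseen
        refs.append(fp)
    return store, refs

# Ten logical blocks, but only two unique payloads:
blocks = [b"OS image block"] * 8 + [b"user data block"] * 2
store, refs = dedup(blocks)
print(len(refs), len(store))  # 10 2
```

Ten logical blocks land as two physical ones here, which is the “do more with less” effect Stelter describes, compounded further by compression and thin provisioning.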
Brick-Margelofsky noted that as flash prices drop and approach parity with other storage methods, “it just becomes more efficient to adopt flash into the data center for most every workload.”
Why? Flash remains slightly more expensive than other storage methods today, though the gap was far wider a few years ago, according to Brick-Margelofsky. That price difference, while it still exists, can be made up in savings, because companies that adopt flash do not waste administrative or engineering time figuring out the best location to place certain workloads, he said.
Tiered performance, he said, is a problem “that we have solved by utilizing flash and utilizing that technology. And we can go work on things that are more important to the business, because performance just becomes an issue that we don’t even have to think about anymore.”
For more information on the changing nature of technology deployment, check out “Businesses' Infrastructure Must Help Them Rapidly Adapt to Changing Markets.” And for more on the benefits of flash storage, check out the CDW webinar sponsored by NetApp.