Larry Freeman is a senior technologist in the Office of the CSO at NetApp.
A frequent speaker and author, Larry's current role at NetApp is educating IT professionals on the latest trends, techniques, and best practices in data storage. He has authored many technical papers and the book "Evolution of the Storage Brain," and he hosts a popular blog, "About Data Storage."
Larry has held various technical positions during more than 30 years in the data storage industry. At Telex Computer Products, he was responsible for maintaining mainframe disk and tape drives at key field sites. At NEC Information Systems, he designed disk drive quality measurement criteria for major OEM customers. At Spectra Logic, he was the senior product manager for enterprise tape libraries, and at NetApp he has helped demystify technologies such as deduplication and storage efficiency.
This is a weblog of Larry Freeman. The opinions expressed are those of Larry Freeman and may not represent those of Computerworld.
By now, you’re probably familiar with IT’s new business mindset: that of a highly efficient service provider. Cloud services, consumerization and BYOD are just a few of the business-driven topics that are currently generating buzz.
However, another major trend has begun to silently transform the data center by giving IT equipment an intelligence all its own. It turns out that, in order to truly modernize IT, your application servers, networks, data storage systems, and even end-user devices must operate intelligently, and often independently.
What do I mean by that? Read on...
Are you fed up with reading 2014 predictions yet? Each year around this time, pundits in the tech sector make predictions about which transitions will occur in the coming year.
Rather than adding to the long list of prognosticators making guesses about what will happen in 2014, I thought I'd take a slightly different tack: Here are five things that won’t be happening in the data storage industry during the coming year.
Think hard drives will go away? Think traditional, monolithic arrays are safe?
Think again. I’ll be busting these and other myths today...
Cloud has proven to be a sticky marketing term, even (especially?) among non-technical consumers. There is a growing consensus that “the cloud” promises a quicker and cheaper way of purchasing IT than the traditional, build-it-yourself model. This in turn has led to internal debates between application developers and infrastructure teams on the best approach for implementing “the cloud” within IT organizations. (And of course, IT architects know that when they hear management say, “We want to move to the cloud,” they will be the ones who have to figure out how to do it!)
It's fair to say that big data has experienced more than its share of hype over the past year. According to Gartner's 2013 Hype Cycle for Storage Technologies, big data is approaching its peak of inflated expectations, which means that it will soon be headed for the inevitable plunge into the trough of disillusionment. Here's my big-data year-in-review, plus four top tips for IT success in 2014.
2013 was the year that flash became "standard equipment" for mainstream storage arrays. As the year comes to a close, it's hard to envision a new enterprise-class array being deployed without flash in some shape or form. Here's my flash year-in-review, plus five top tips for IT success in 2014.
Enterprise storage architects have developed many ways to increase application performance with flash memory. This has raised the bar for IT professionals, who must understand the differences among these approaches to evaluate their options. So here's a handy guide, including use cases and caveats for typical applications.
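To make one of those approaches concrete: a common flash deployment is a read cache sitting in front of slower hard disks, where frequently accessed blocks are served from flash and cold blocks still pay the HDD penalty. The sketch below is purely illustrative; the class name, block IDs, and latency figures are round-number assumptions, not measurements of any real array.

```python
from collections import OrderedDict

# Assumed, illustrative latencies (microseconds), not real device specs.
FLASH_READ_US = 100
HDD_READ_US = 10_000

class FlashReadCache:
    """A minimal LRU read cache standing in for a flash tier (hypothetical)."""

    def __init__(self, capacity_blocks):
        self.capacity = capacity_blocks
        self.blocks = OrderedDict()  # block id -> cached data, in LRU order

    def read(self, block_id):
        """Return (data, latency_us); serve hits from flash, cache on miss."""
        if block_id in self.blocks:
            self.blocks.move_to_end(block_id)    # mark most recently used
            return self.blocks[block_id], FLASH_READ_US
        data = f"hdd-data-{block_id}"            # simulate a slow HDD read
        self.blocks[block_id] = data
        if len(self.blocks) > self.capacity:
            self.blocks.popitem(last=False)      # evict least recently used
        return data, HDD_READ_US
```

The toy model shows why caching pays off only for re-read ("hot") data: the first read of any block costs full HDD latency, while repeat reads of cached blocks return at flash speed until the working set outgrows the cache and eviction kicks in.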
The first two parts of this series covered two trends that IT workers should be aware of: the mega-shifts to cloud computing and to the software-defined data center. Today, I'll talk about the third trend: a fundamental shift in focus within storage systems themselves, from spinning hard disk drives (HDDs) to flash.
Do you have an IT job involving enterprise data? More importantly, do you want to keep it? In Part 1 of this series, I covered the mega-shift to cloud computing, which has begun to reshape IT. The second trend has received little media attention but is no less profound: the mega-shift from human management to software automation occurring within the walls of IT. This is the story of the software-defined data center (SDDC).
Do you have an IT job involving enterprise data? More importantly, do you want to keep it? If you answered “Yes” to both questions, you’ll want to make sure you’re fully prepared for three big shifts in technology that are reshaping today’s enterprise data center. In the next few blog posts, I'll talk about these "mega-shifts," such as the software-defined data center and flash SSD storage. But in this first blog post, I’ll discuss cloud computing. Embrace it or risk being outsourced...