Over the past year, new storage technology innovations for active archives have enabled organizations to gain reliable access to all of their data, all of the time. As a result, these organizations have experienced increased cost savings, decreased energy consumption and improved storage administrator efficiency.
The key drivers that will impact the continued use of active archive in the future include the decreased cost of flash storage, greater automation, the rise of tiered storage workflows and the growth of tape in public and private cloud infrastructures.
Members of the Active Archive Alliance recently shared their perspective on the outlook for data storage and active archive in 2017. Here is a list of the top trends to watch:
Automated Policies and Artificial Intelligence Come to Storage
Greater automation and the use of artificial intelligence (AI) will simplify storage management and the use of active archives. New technology will help resolve two of the greatest challenges facing data management: data classification and storage classification. New software tools will use metadata to power automation, providing a simple solution for data management.
Tiered Storage Workflows on the Rise
As the cost of flash storage decreases and disk continues to struggle to maintain its cost and capacity curve, customers will look to adopt tiered storage workflows. Flash’s usable cost could drop below $1.00 per gigabyte, while tape in large systems will cross below $0.06 per gigabyte for nearline storage and below $0.03 per gigabyte for offline or cold storage.
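At those price points, the per-petabyte gap between tiers is dramatic. A quick back-of-the-envelope calculation using the projected figures above (the prices are the projections quoted, not vendor list prices):

```python
# Cost of storing 1 PB at the per-gigabyte prices projected above:
# flash ~$1.00/GB, nearline tape ~$0.06/GB, offline tape ~$0.03/GB.
GB_PER_PB = 1_000_000

tiers = {"flash": 1.00, "nearline tape": 0.06, "offline tape": 0.03}
costs = {tier: price * GB_PER_PB for tier, price in tiers.items()}

for tier, cost in costs.items():
    print(f"{tier}: ${cost:,.0f} per PB")
```

Even at the nearline rate, a petabyte on tape costs roughly 6% of the same capacity on flash, which is the economic case for tiering in a nutshell.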
Energy Consumption Challenges Drive Active Archive Adoption
Active archiving will get a boost as organizations continue to seek ways to reduce energy consumption - a significant and growing component of operating expenses for today’s rapidly expanding data centers. With servers and HDDs consuming more than 30% of the energy required to run IT hardware, data center managers will look for ways to reduce utilization of power-intensive hard disk drive technology. Less frequently accessed data will be moved to an active archive where it will remain accessible but consume less power on more economical storage tiers such as automated tape libraries on premises or in the cloud.
Onramps Drive Growth in Tape Usage
Tape usage will grow substantially as a key component of public and private cloud infrastructures for cold and active archive data. Tape’s inherent attributes of low cost, reliability and even portability, combined with the increasing availability of file- and object-based onramps to tape, will accelerate adoption of tape beyond its historical role as a target for backup and recovery software. Solutions that offer data management intelligence and that integrate well with storage targets will alleviate the historical management burdens associated with tape automation deployment, further fueling tape’s penetration of cloud infrastructures where long-term preservation of and access to data are required.
Cloud and Object Storage Bring More Flexibility to Archived Data
Object storage software can transform an archive into an active archive that is positioned between high-performance storage and tape. A combined cloud and object storage infrastructure provides additional capabilities allowing users to collaborate at LAN speeds and access data from any device, anywhere in the world. As content continues to grow, an active archive can scale seamlessly to billions of objects in a single namespace. Flexible, user-defined data protection options will be key in making this a reality.
Ethernet Continues to Gain Market Share
Ethernet is winning. There are still a number of ways to connect storage and active archives to compute: Fibre Channel, Ethernet, SAS, SATA and InfiniBand. Even though some of these connections are lighter weight from a protocol standpoint, Ethernet will continue to gain market share as the external host connection of choice.
The following Active Archive Alliance members contributed to this list: DDN Storage, Fujifilm Recording Media USA, Inc., Spectra Logic, Strongbox Data Solutions and Quantum.
By Rich Gadomski
I recently returned from the great city of Boston where I had a lot of fun with the locals who have a “wicked” sense of humor and a funny accent I can’t imitate being from New York. I was in town facilitating our 8th Annual Global IT Executive Summit. The theme this year was “Exploring the New World of Storage.” Boston was a fitting location given its history in shaping our country and the storage industry too. Speakers from the analyst, vendor and end user communities presented on the latest trends that are emerging in a new world of storage driven by so many innovations in flash, disk, tape, cloud and data management software.
One subject that kept popping up was the need to control runaway costs associated with unrelenting data growth that’s compounded by long-term retention requirements. A common solution that many speakers referenced was the need for a well-planned tiering strategy where data moves as it ages from expensive tiers of primary storage to secondary, tertiary and even the cloud as a fourth tier. Speakers presented on various solutions to manage data growth with long-term retention requirements stemming from compliance regulations, protection of business assets and content, disaster recovery and big data analytics.
A key concept that got a fair amount of the spotlight in the tiering conversations was active archiving solutions. In an active archive, long-term data remains cost effectively online and easily accessible by leveraging innovative and integrated solutions that intelligently manage data across flash, disk, tape and the cloud. This is really where cost savings can come into play by matching data types to the right tier of storage.
It was pointed out by more than one speaker that an active archive solution can be implemented with existing storage equipment and without changing workflows, while remaining transparent to the end user. Since we can’t predict when archival data will be needed again, and since that data is being kept for longer and longer retention periods, what’s needed are cost-effective solutions that automatically migrate data from generation to generation of low-cost yet reliable media such as LTO tape.
It was also noted during the conference that with today’s advanced tape solutions, long-term affordable active archiving is a reality. Tape now offers: the lowest TCO for long-term storage thanks in part to its high capacity and very low energy consumption; the best reliability as measured in bit-error rate; transfer speeds greater than HDD at 360 MB/second; and a long archival life of 30+ years. Speaking of capacity, the potential for 220 TB on a single tape cartridge based on Barium Ferrite technology has already been proven. Barium Ferrite will enable achievement of the tape technology roadmap plans well into the future and will support active archiving for many decades to come.
Now if we can just get the Bostonians to pronounce the “r” in active archive, we’ll be all set!
Join us for an exciting webinar series presented by leaders from the Active Archive Alliance. Our goal is to align the education and technologies needed to meet the rapidly evolving requirements for data archive.
Alliance members strive to extend solutions beyond the high-end supercomputing and broadcast markets to the greater general IT audience that is in need of online data archive options. The following three-part series will help to educate and promote active archive strategies for data storage.
Thursday, June 2 at 8am Pacific/11am Eastern
Object Storage May be the Cloud You are Looking For
- Why use object storage?
- Learn more through examples from organizations currently using object storage in an Active Archive solution.
Hosted by: Quantum/HGST
Tuesday, June 7 at 8am Pacific/11am Eastern
Best Practices in Leveraging Active Archives to Solve Data Protection and Cloud Requirements
- What is block, file, object?
- How is this defining new approaches to active archive and object storage?
- What considerations should customers take into account when implementing?
Hosted by: Spectra Logic/DDN Storage
Tuesday, June 21 at 8am Pacific/11am Eastern
MLB Network Hits Home Run with Active Archive
- What is active archiving and why is it important today?
- A quick lesson: backup vs. archive - what’s the difference?
- Technologies used for archiving: tape, cloud, disk, flash and data movers.
Hosted by: Fujifilm/StrongBox Data Solutions
Register now to learn more about how an active archive can give you access to all your data, all the time: http://bit.ly/1R6nUk2.
In today’s media and entertainment industry, workflows need to seamlessly integrate primary storage and application platforms to deliver performance and accessibility. Active archive has become invaluable for repurposing assets that already exist and require large amounts of storage. And, we’re seeing continued innovation in the use of active archives for long-term data access and preservation.
All of the Active Archive Alliance members will be showcasing their solutions for active archive at NAB 2016 next week in Las Vegas. Here’s a peek at what each one has planned and where to find them:
DDN Storage (Booth SL8016)
DDN Storage will showcase how its solutions like the high-performance MEDIAScaler™ Converged Media Storage can modernize your entire workflow in one easy step and deliver much more value and profitability to your organization.
Fujifilm Recording Media USA, Inc. (Booth SL7613)
Fujifilm will be exhibiting its Dternity solution, including new features like Partial File Restore and the Dternity VM – the world’s first virtual machine active archive. Additionally, Fujifilm will be discussing their new Data Migration Services aimed at helping customers break their archives out of the past while integrating new technologies.
StrongBox Data Solutions, Inc. (Booth SL7613)
Visit the StrongBox team in the Fujifilm Dternity booth (SL7613) and discover how to simplify your workflow for long-term content preservation. StrongBox is giving away a PETABYTE of storage, so be sure to stop by the booth to enter.
Hewlett Packard Enterprise (Booth SL2425)
Hewlett Packard Enterprise will be demonstrating the HPE StoreEver Tape archive solution with the latest LTO-7 tape drives. This solution allows M&E customers to easily integrate the cost effectiveness and reliability of LTO tape into their workflows, as if it were disk, for long-term archive of their media assets. Come see the solution in action in the StudioXperience booth (SL2425).
HGST (Booth SL9721)
HGST will showcase its 4.7PB Active Archive System, an object storage system that transforms silos of data storage into cloud-scale active archives, featuring its innovative helium-filled 8TB drives. Visit the booth to enter a daily drawing for a G-Technology 1TB G-Drive, or play the HGST Partner Passport Program game for a free t-shirt. Tweet a photo wearing your shirt to enter a daily drawing for an Amazon Echo.
Quantum (Booth SL8416)
Quantum is announcing new partner integrations with both StorageDNA and Marquis to deliver comprehensive AVID ISIS archival solutions. These active archive capabilities enable ISIS administrators to manage their storage effectively, offloading completed or stalled AVID projects to Quantum Artico, Scalar tape libraries, Lattus object storage and Q-Cloud for near-term and long-term content retention and protection. Stop by Quantum’s booth (#SL8416) to learn more about this and many other exciting demonstrations.
Spectra Logic (Booth SL11816)
This year at NAB Show 2016, Spectra Logic is a finalist for the IABM Game Changer Awards, will be giving a presentation about ‘Genetic Diversity’ April 20th at 10:00am PST, and has products on display in several partner booths. Interested in hearing about our hybrid storage ecosystem for media and entertainment? Come by Booth #SL11816 to learn how Spectra’s deep storage solutions utilize active archive to grant users access to multiple tiers of highly scalable and affordable storage.
Be sure to visit these member booths and ask how an active archive can help you more easily manage and preserve your digital media assets.
by Shreyak Shah
Data from all industries is growing, with massive amounts of structured and unstructured data expected to quadruple in size by 2020. From Life Sciences, Media and Entertainment, Video Surveillance, and Oil and Gas to cloud providers, many organizations are seeking ways to add more storage capacity and performance while still keeping their ‘archived’ data accessible and secure. One approach that is seeing a lot of traction is active archive, which is beneficial in terms of accessibility, flexibility, security and scalability and is ideal for organizations with data that requires long-term retention and fast, easy retrieval.
Archives are undergoing a radical transformation, fueled by rapidly growing file sizes and new access requirements. Object storage enables IT managers to archive content sooner, reducing the cost of tier-one storage. This preserves opportunities to monetize content while increasing access and speed at a substantially lower price point than traditional RAID disk-based solutions (and not much more than tape).
All too often businesses have had to offload months to years of historical transactions to secondary or tertiary data storage repositories. With the increased volume of data sources and speed of new data creation, keeping pace with these repositories has become a daunting task for any size of organization. More and more organizations require an active archive that allows for easy access to all pools of storage permitting the management of limitless volumes of data for better intelligence, competitive advantage or for regulatory requirements.
Object storage software can transform an archive into an ‘active’ archive that is positioned between high-performance storage and offline tape. The active archive solution brings it all together -- cost-effective spinning disk storage or online tape libraries, a collaboration platform, online content distribution and data protection. It also provides the foundation for a resilient object storage platform that delivers the highest level of data durability in the industry, surviving an entire data-center outage when deployed across multiple sites.
CTOs do not wish to purchase two to three years of storage in advance, as it often leads to over-budgeting and under-utilization. Power, cooling and floor space are all wasted with this model. Object storage enables administrators to buy, manage and deploy exactly what is needed and to scale performance and capacity as the organization expands.
It also allows an organization to take control of its content and easily create and manage public, private or hybrid clouds that sit securely behind a firewall. Administrators can also economically and efficiently manage capacity of up to double digit Petabytes with just one full-time employee and enjoy the benefits of the system’s self-healing capabilities. As content continues to grow, an active archive can scale seamlessly to billions of objects in a single namespace. Flexible, user-defined data protection options are key in making this a reality.
Implementing an active archive creates incremental revenue opportunities and provides the ability to convert what has historically been a cost center into a revenue-producing asset. A combined cloud and object storage infrastructure provides additional capabilities, allowing users to collaborate at LAN speeds and access data from any device, anywhere in the world.
By Rich Gadomski
I recently attended the Storage Visions 2016 Conference to participate on a speaker panel entitled: “Saving Data Forever: Long Term Content Preservation and Archiving.” The panel was in agreement that “forever” is a long time especially in the world of IT storage. While we were hard pressed to predict what storage technologies would be available 5,000 years from now as one attendee asked, our advice was to put an active archive in place that can routinely manage the migration of data from performance tier to economy tier and from older storage formats to new ones. The benefits of old to new migration typically include better performance, reduced footprint from greater density and lower total cost of ownership.
In a typical active archive environment, data management software migrates data by policy from expensive primary storage tiers to more cost-effective tiers such as tape, while maintaining the convenience of online file access to all of the data. Data can also auto-migrate from older tape formats to newer formats within a tier. Take LTO tape as an example: you can upgrade your LTO-5 drives to LTO-7 drives and auto-migrate data in this tier from LTO-5 media to LTO-7 media. In doing so, you will get the benefits of an easier conversion with a big jump in per-cartridge capacity from 1.5 TB to 6.0 TB and a much faster transfer rate, going from 140 MB/sec to 300 MB/sec, thus reducing the number of drives and robotic library slots required and giving you more room to grow.
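Policy-driven migration of this kind usually boils down to a simple age test applied per file. A minimal sketch of such a policy in Python (the 90-day threshold and tier names here are illustrative, not taken from any particular product):

```python
from datetime import datetime, timedelta

# Illustrative policy: files untouched for longer than the threshold
# move from the primary tier to the tape tier.
ARCHIVE_AFTER = timedelta(days=90)

def select_tier(last_access, now):
    """Return the storage tier a file belongs to under the age-based policy."""
    return "tape" if now - last_access > ARCHIVE_AFTER else "primary"

now = datetime(2016, 6, 1)
print(select_tier(datetime(2016, 1, 1), now))   # long-idle file
print(select_tier(datetime(2016, 5, 20), now))  # recently accessed file
```

In a real active archive, the data mover applies this decision continuously and keeps a file-system view of migrated files, so reads stay transparent even after data has moved to tape.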
The ability to migrate is key to keeping up with relentless data growth for the long term. That is why having a reliable technology roadmap is so important. The LTO roadmap has been extended from eight generations to ten. Currently, the newest generation in the market is LTO-7, with a native capacity of 6.0 TB. Generations 9 and 10 have been added to the LTO roadmap, where we can expect native capacity in generation 10 to rise to an impressive 48 TB, eight times greater than LTO-7, with an equally impressive native transfer rate of 1,100 MB/sec. These new generations will provide the ability for the ongoing migration necessary to keep up with data growth and will ensure backwards compatibility with the two previous LTO generations, as usual. No other data recording technology can present a roadmap with this much capability to look forward and plan for comprehensive data archiving.
Is generation 10 the end of the roadmap? Not likely, as IBM and Fujifilm announced back in April of 2015 the achievement of a new record of 123 billion bits per square inch in areal data density on linear magnetic tape using Barium Ferrite particle technology. This equates to a standard LTO cartridge capable of storing 220 TB of uncompressed data, 36 times greater than LTO-7 capacity! Given this achievement, the new LTO roadmap should be easily achieved and extended beyond generation 10.
This is good news for your migration strategy and long term, cost-effective active archiving!
By David Cerf
More than 50 years ago, IBM and General Electric were grappling with computing labs’ needs to process more than one computer function at a time. The pioneers of virtual machines needed a way to share mainframe resources among disparate users. Today, you can hardly glance down a list of IT news without seeing “virtual machine” or “virtualization.” Businesses figured out that it was much easier to share resources for various functions than to maintain separate, physical systems for every process.
Why a VM-enabled Active Archive?
Scaling storage typically means adding another server to your rack, filling it up and the cycle continues. For businesses without the luxury of real estate, another rack might not be possible – you can only stack so high.
With a VM-archive, you can use a few GBs of space on an existing server plus a tape library or even the cloud to build a scalable active archive. This can be especially vital for businesses with multiple sites or multi-tenant architectures. The VM archive should be able to manage both file and object-based storage to support massively scalable workloads.
Plus, we like that instant gratification. Download, install and start writing data in minutes, not days. You aren’t stuck waiting for a box to ship to you, and there’s no additional hardware maintenance needed.
Upgrade, but don’t Overhaul
If you’ve ever done any remodeling, you’ll know that it’s a lot less expensive to repaint the rooms and replace appliances rather than tear down the whole house and start from scratch. The same is true for deploying a VM active archive. Adding a virtual machine means you don’t have to mount new hardware or send existing servers to their deathbeds. Instead, re-use and repurpose to save money and simplify getting started with your archive project.
A VM-enabled active archive allows you to start small and expand the archive as you need. This way, an archiving project is easier to get started with and tackle as data continues to grow.
Let’s look at a few ways companies are leveraging VM-enabled archiving today:
- Got tape libraries? Add a VM archive to an existing library and instantly create shared storage for nearline and archive.
- No local tape? No problem. A VM archive can keep active data locally and automatically create copies for off-site storage or the cloud for disaster recovery and data preservation.
- Looking to simplify multi-site storage management? A VM archive can provide on-site storage that automatically replicates to a second data center, the cloud, and multi-tenant architectures.
Our friend, Jon Toigo, has been exploring this concept in his latest video installment with Barry M. Ferrite, AI. Check out some of his videos here for more information on VM archiving.
Active archives have become a best practice for organizations that need to store and access large volumes of data. Stemming from recent technology advancements in the storage industry that have led to a variety of approaches to creating and implementing an active archive, more organizations are now benefitting from reliable access to all of their data all of the time.
Members of the Active Archive Alliance recently shared their predictions for data storage as it relates to active archives in 2016. Here is a list of the Top 7 active archive trends to watch:
The Move to Hybrid Cloud
Organizations are increasingly integrating the private cloud within their computing architectures to form stable and efficient hybrid cloud systems, which helps mitigate the limited capabilities and risks of public clouds. Cloud data centers are targeted and intermittently attacked by malware and hackers, making them among the most threatened environments. With the hybrid cloud, organizations can effortlessly adjust their public cloud resources to accommodate changes while maintaining sensitive information within their private infrastructures.
Increased Prevalence of Active Archives
The declining prices of hard disk drives and tape systems are making storage options more affordable and will allow more organizations to implement active archives. This coincides with a desire for better business insight through analytics using larger data sets and modeling to accelerate time to results. Data is becoming more valuable as the use of analytics increases, and businesses want to use historical data to make better decisions. Active archiving with unified data across multiple tiers of storage gives businesses faster access to data, better insights and the ability to make more informed decisions.
Data Centers Become More Empowered
Data centers will become competitive forces used to drive business advantage, because information is a major asset in today’s global economy. Organizations will see the huge benefit of monetizing archived data storage: keeping most data storage active and retaining it for longer cycles, which is the basic premise and benefit of active archiving combined with scale-out storage. The Media & Entertainment and Life Sciences & Genomics industries are among the use cases where historical information can suddenly turn into a hot business asset.
Seamless Storage Management Regardless of Technology
New software abstracts flash, disk, tape and cloud into simple, easy-to-use storage that works within current user behavior (no special or custom integration required). This intelligence will blur the line between performance storage and cost-effective capacity storage, and it will make tape and cloud as easy to use as the current C: drive. It will also reduce, or even eliminate, backup for fixed content. Archive copies will become the new standard, and intelligent storage management will automatically provide data protection (number of copies, self-healing) for both local and off-site copies. It will also have the intelligence to know when to store data in flash for performance and in tape or cloud for resiliency and cost-effective storage.
An End to Vendor Lock-in
There will be a move away from proprietary solutions that create "vendor lock-in" that lock up user data with vendor dependency, i.e. silo solutions or proprietary software and hardware. As we keep data longer or even forever, we will need solutions that are flexible, vendor neutral, support completely open formats and ensure that data can be accessible now and in the future.
Expanded Role for Advanced Data Tape in Active Archive Environments
With organizations seeking to keep access to all of their data and content indefinitely, new innovations in tape technology will make this possible and affordable in an active archive environment. Increased capacity coming from LTO-7, now at 15.0 TB compressed, will help reduce TCO and boost performance with a compressed transfer speed of up to 750 MB per second. The newly extended LTO roadmap to generation 10 will allow organizations to leverage investments already made in LTO systems and continue to migrate archived content well into the future. Finally, tape’s role will continue to grow as a seamless part of the storage infrastructure and the cloud as it becomes easier to use as a file and object storage solution thanks to LTFS.
Increased Use of Object Storage
Companies will increasingly expand their active archives using object-based storage over file or block storage. Object storage systems allow relatively inexpensive, scalable and durable retention of massive amounts of unstructured data. With object storage, there is no file system hierarchy. The architecture of the platform allows the data pool to scale virtually to an unlimited size, while keeping the system simple to manage. The efficiency of object storage makes massively scalable active data archives affordable.
In addition, more organizations will adopt S3 as the de-facto standard cloud storage interface for managing large amounts of data in active archives. As organizations increasingly need to simplify and accelerate storage and retrieval for large amounts of data, extracting knowledge and information from historical deep archives as quickly and painlessly as possible, they will need a cost-effective and widely compatible object storage interface.
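Part of S3’s appeal as an archive interface is its simplicity: there is no directory hierarchy, just buckets holding keys, payloads and metadata. A minimal sketch of how a file path maps onto a flat S3-style object record (the bucket name and fields are illustrative, not any particular vendor’s API):

```python
import hashlib

def to_object(path, data, bucket="active-archive"):
    """Map a file to a flat S3-style object record: the full path simply
    becomes the key, with an MD5 ETag and minimal metadata alongside."""
    return {
        "bucket": bucket,
        "key": path.lstrip("/"),                # flat namespace: path == key
        "etag": hashlib.md5(data).hexdigest(),  # S3-style content checksum
        "metadata": {"content-length": str(len(data))},
    }

obj = to_object("/projects/2016/cut01.mov", b"frame-data")
print(obj["key"])  # the key keeps the path but implies no hierarchy
```

Because keys are opaque strings, an archive can grow to billions of objects in one namespace without the lookup overhead of a nested file system, which is what makes S3-style interfaces attractive for deep archives.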
Active archive technologies are continuing to evolve and provide significant advantages to organizations desiring online data archives and the ability to quickly capitalize on the value inherent in their stored data – and with the trends we are seeing today 2016 should be another stellar year for active archiving.
The following Active Archive Alliance members contributed to this list: Crossroads Systems, Inc., DataDirect Networks Inc., HGST, Fujifilm Recording Media, Inc. and Spectra Logic Corp.
By David Cerf
Active archiving is centered on the convergence of various storage technologies to create a balanced, accessible and affordable method for storing data long term. But what if you don’t know how much of your data belongs in this “archive” category? The truth is that the seemingly easiest method - use what you have, fill it up and buy more - gets unnecessarily expensive. This is especially true if you’re using a high-performance storage array for data that doesn’t need those performance capabilities.
While each company may define archive differently, here are a few common criteria for “archival” data:
- Data is not accessed regularly
- Data was created more than 1 year ago and has not been accessed in more than 90 days
- Data has not been modified or accessed in more than 90 days
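Criteria like these can be checked directly against file-system timestamps. A minimal sketch that flags a file as an archive candidate when it satisfies the last criterion above (the 90-day window comes from the criteria; the rest is illustrative):

```python
import os
import tempfile
import time

ARCHIVE_AGE = 90 * 24 * 3600  # 90 days, in seconds

def is_archival(path, now=None):
    """True if the file has neither been modified nor accessed
    within the archive-age window."""
    now = time.time() if now is None else now
    st = os.stat(path)
    return (now - st.st_mtime > ARCHIVE_AGE) and (now - st.st_atime > ARCHIVE_AGE)

# Demo: create a file and backdate both timestamps by 120 days.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"old report")
    path = f.name
old = time.time() - 120 * 24 * 3600
os.utime(path, (old, old))   # set (atime, mtime)
print(is_archival(path))     # a 120-day-old file qualifies
os.remove(path)
```

Walking a whole tree with the same test is a rough stand-in for what commercial assessment tools do at scale; note that some systems mount file systems with `noatime`, in which case access-time checks are unreliable and modification time is the safer signal.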
Sometimes companies simply don’t know how much data they have that should be archived. Fortunately, there are free tools like the storage assessment at www.freemystorage.com that can help you answer this question by showing dynamic reports and transparent views into the state of your storage environments.
With detailed results on the active state of storage and the file make-up of active and inactive data, assessment tools can help organizations understand how to free up their storage and avoid unnecessary upgrades and over-provisioning. These insights can help organizations save thousands of dollars each year and reduce operating and capital expenses.
Why should end users think differently about their storage?
The Active Archive Alliance thinks there is a better option to meet the growing demand for storage. Data is growing faster than budgets, driving a need for more cost-effective storage and protection. While high-performance storage keeps line-of-business data and applications quickly available, up to 80% of data on primary storage is often inactive and doesn’t need to claim expensive storage capacity. Storage needs to be intelligently balanced between active and inactive data, on-site and off, and must meet both performance requirements and budgets. Active archiving can deliver intelligent storage management that combines tunable, user-defined policies for capacity optimization (no more over-provisioning) and will:
- Simplify nearline and archival storage management
- Free up storage capacity
- Reduce backup
- Simplify data protection
- Protect long-term content
- Reduce storage costs by over 50%
Discover the benefits of reduced primary storage cost with automated data movement into an active archive solution that best fits your needs. Be sure to choose one that can transparently move files to your archive from primary storage like NetApp, Windows, Isilon and any CIFS/NFS or object storage.
Get started at FreeMyStorage.com for a free assessment to help you take control of your storage.
By Rich Gadomski
Object storage has made great strides in the decade since EMC released its Centera system. Since then, numerous other vendors have brought their own object storage systems to the market. Object storage is great for storing unstructured data, since it separates the metadata from the data so the storage system isn’t dependent upon the particular file system or block storage structure. Additionally, administrators do not have to worry about matters such as setting RAID levels or building and managing logical volumes. Lastly, from an integration perspective, object storage is a good platform for archiving because it is massively scalable, cost effective, and is able to act as a cloud infrastructure for collaboration.
However, there was one major problem in using object storage for archiving – at least until recently.
“You can't do it -- get object storage taped, I mean,” wrote Chris Mellor, storage editor at The Register in March 2012. “There is no way to get the contents of an object storage system onto tape. Instead, it has to stay on spinning disk forever.”
And since it had to stay on spinning disk, this meant continually buying more storage arrays, as well as laying out all the support, networking, licensing, power and cooling needed to keep those disks spinning.
“As the amount of data to be stored grows and grows, tape will become the lowest-cost option,” wrote Mellor. “For high-volume data archive capacities, disk economics suck, and it’s no use pretending data deduplication and thin provisioning can change that. … What is needed is a way to drain off cold, inactive objects from disk and stuff them into a tape archive. Isn't it obvious?”
Well, three years is a long time in IT, and apparently tape storage vendors did think it was obvious that they should support object storage. A year after Mellor wrote his plea, various storage vendors began releasing tape systems that could store objects, including many from the Active Archive Alliance that have released tools to make object storage feasible in a tape environment.
A good example of this is Fujifilm’s Dternity NAS which allows for both file and object storage on LTFS tape media in its active archive solution. By utilizing a standard S3 interface with an underlying RESTful API, cloud storage users can connect to Dternity directly without needing to program special calls or APIs. Active archives managed by Dternity NAS are easily accessible by CIFS/NFS or S3.
The bottom line is that tape is a viable place for object storage. This opens the door to massively scalable object stores comprising billions of graphical images, for example. Not only is it possible to achieve this, but doing it on tape, which recently demonstrated 220 TB on Barium Ferrite media, means that it can now happen in a cost-effective manner.