Tuesday, December 14, 2010
Concerns with cloud backup solutions: Security and performance top list
But that architecture and delivery model make some administrators nervous.
Read more
Wednesday, November 10, 2010
CA Inc. today upgraded its ARCserve Replication and High Availability products. Version 15.2 adds multi-stream replication for high-latency networks, a wizard to assist administrators in protecting non-Microsoft and custom applications, support for updated Microsoft applications, and integrated management-console access to Web 2.0 interactive technologies and documentation resources.
Tuesday, November 9, 2010
Data deduplication is a technique to reduce storage needs by eliminating redundant data in your backup environment. Only one copy of the data is retained on storage media, and redundant data is replaced with a pointer to the unique data copy. Dedupe technology typically divides data sets into smaller chunks and uses algorithms to assign each data chunk a hash identifier, which it compares to previously stored identifiers to determine if the data chunk has already been stored. Some vendors use delta differencing technology, which compares current backups to previous data at the byte level to remove redundant data.
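To make the chunk-and-hash process concrete, here's a minimal Python sketch of hash-based deduplication. It assumes fixed-size 4 KB chunks and SHA-256 identifiers purely for illustration; commercial products typically use variable-size chunking and far more sophisticated indexing.

```python
# Minimal sketch of hash-based deduplication (illustrative assumptions:
# fixed-size chunks, SHA-256 identifiers, an in-memory chunk store).
import hashlib

CHUNK_SIZE = 4096  # fixed chunk size, for illustration only

def deduplicate(data: bytes, store: dict) -> list:
    """Split data into chunks, keep one copy per unique chunk in `store`,
    and return the list of hash identifiers (pointers) for the stream."""
    pointers = []
    for offset in range(0, len(data), CHUNK_SIZE):
        chunk = data[offset:offset + CHUNK_SIZE]
        digest = hashlib.sha256(chunk).hexdigest()
        if digest not in store:      # chunk not seen before: store it once
            store[digest] = chunk
        pointers.append(digest)      # redundant chunks become pointers
    return pointers

def restore(pointers: list, store: dict) -> bytes:
    """Rebuild the original stream from pointers and the unique-chunk store."""
    return b"".join(store[p] for p in pointers)

# Usage: two backups that share most of their data
store = {}
backup1 = deduplicate(b"A" * 8192 + b"B" * 4096, store)
backup2 = deduplicate(b"A" * 8192 + b"C" * 4096, store)
assert restore(backup1, store) == b"A" * 8192 + b"B" * 4096
print(len(store))  # 3 unique chunks stored instead of 6
```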
Monday, November 1, 2010
Learn the 10 steps to building a strong records retention management system that satisfies e-discovery requests and protects your firm against regulatory scrutiny.
Thursday, October 14, 2010
German-based dataglobal GmbH moved into the U.S. market this week, launching the dg suite of storage resource management (SRM) and enterprise content management (ECM) applications it claims will enable a universal content data repository.
Thursday, October 7, 2010
Virtualization capacity planning can be a daunting task. You're creating virtual machines (VMs) on the fly while moving applications across virtual and physical resources. Ensuring enough storage capacity for all these moving parts now and in the future might seem impossible.
Tuesday, October 5, 2010
Digitiliti Inc. today began shipping an upgrade to its DigiLibe unstructured data management system with the goal of helping customers incorporate cloud storage into data archiving.
Wednesday, September 29, 2010
Dexrex Gear this week expanded its instant messaging archiving capabilities to include multimedia messages and social media data for BlackBerry users. Dexrex Gear's ChatSync Mobile Enterprise builds on the vendor's ChatSync platform for capturing and routing text-based messaging into a centralized storage repository for compliance and e-discovery management.
Tuesday, September 21, 2010
Nasuni Corp. today released an upgrade of the Nasuni Filer virtual appliance that serves as a gateway to cloud storage providers. Nasuni 2.0 includes support for the Microsoft Hyper-V hypervisor, Windows Azure cloud-services platform, Microsoft Distributed File System (DFS), and Windows Previous Versions file restore technology.
Wednesday, September 15, 2010
Redundant array of independent disks (RAID) is a technique for storing the same data in multiple places across several hard disks to improve read performance and fault tolerance. In a properly configured, redundant RAID storage system, the loss of any single disk will not interfere with users' ability to access the data stored on the failed disk.
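To illustrate the fault-tolerance idea, here's a toy Python example of single-parity (RAID 4/5-style) protection. The four-byte "disks" are an assumption made for readability; real arrays perform this striping and parity work at the block level in hardware or firmware.

```python
# Illustrative sketch of how single-parity RAID survives a single disk
# failure: the parity block is the XOR of the data blocks, so any one
# missing block can be rebuilt from the surviving blocks plus parity.
def xor_blocks(blocks):
    result = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            result[i] ^= byte
    return bytes(result)

# Three "disks" holding one stripe of data, plus a parity block
disks = [b"AAAA", b"BBBB", b"CCCC"]
parity = xor_blocks(disks)

# Disk 1 fails; rebuild its contents from the remaining disks plus parity
survivors = [disks[0], disks[2], parity]
rebuilt = xor_blocks(survivors)
assert rebuilt == disks[1]   # the lost data is fully recovered
```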
Friday, September 3, 2010
While small- to medium-sized businesses (SMBs) might not have the resources larger enterprises enjoy, they still run into many of the same data storage problems, such as capacity issues, shrinking data backup windows, data sharing and remote access. Compounding the problem is the fact that many SMBs don't have a dedicated IT person, so it can be challenging to determine the ideal small business data storage strategy.
This tutorial can help you understand which architecture is right for your environment, whether it's direct-attached storage (DAS), network-attached storage (NAS) or a storage area network (SAN), along with the key factors in choosing a VAR, so you can settle on the best data storage strategy for your small business.
Read more
Wednesday, September 1, 2010
Scale Computing today introduced a clustered multiprotocol storage system for small- to medium-sized businesses (SMBs) that provides thin provisioning, snapshots, and replication as well as iSCSI and network-attached storage (NAS) connectivity.
Read more
Friday, August 20, 2010
Apple Inc. has sold more than 26 million iPhones over the last year, 14.7 million smart phones were sold in the U.S. in the second quarter of this year alone, and Cisco Systems Inc. has sold more than 4 million Flip video cameras since 2007. Because these small devices and their gigabytes of memory are also proliferating in your organization, your corporate data storage strategy needs to include mobile device management.
Tuesday, August 17, 2010
Clustered network-attached storage (clustered NAS) uses a distributed file system that runs concurrently on multiple nodes or servers. Unlike traditional NAS, clustered NAS stripes data and metadata across storage nodes and subsystems. Clustering also provides access to all files from any of the clustered nodes regardless of the physical location of the file. But how do you determine which clustered NAS system is right for you? Here are the questions you need to ask when evaluating your own data storage requirements along with the latest offerings from vendors.
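Before weighing vendor offerings, it can help to picture what striping across nodes looks like. The following Python sketch is purely hypothetical; the node names, chunk size and shared in-memory metadata dictionary are illustrative assumptions, not how any particular clustered NAS product is implemented.

```python
# Hypothetical sketch: stripe a file's chunks round-robin across nodes and
# record the layout in shared metadata, so any node can serve the file.
CHUNK_SIZE = 4
NODES = ["node-a", "node-b", "node-c"]

def stripe(filename: str, data: bytes, metadata: dict, storage: dict):
    """Write chunks round-robin across nodes; record the layout in metadata."""
    layout = []
    for i in range(0, len(data), CHUNK_SIZE):
        node = NODES[(i // CHUNK_SIZE) % len(NODES)]
        storage.setdefault(node, []).append(data[i:i + CHUNK_SIZE])
        layout.append((node, len(storage[node]) - 1))
    metadata[filename] = layout   # shared metadata visible to every node

def read(filename: str, metadata: dict, storage: dict) -> bytes:
    """Any node can reassemble the file from the shared metadata."""
    return b"".join(storage[node][idx] for node, idx in metadata[filename])

metadata, storage = {}, {}
stripe("report.doc", b"0123456789AB", metadata, storage)
assert read("report.doc", metadata, storage) == b"0123456789AB"
```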
Monday, August 16, 2010
Actifio Inc. is coming out of stealth this week with appliances aimed at managing data and enabling disaster recovery in virtual server environments.
On Tuesday, Actifio will launch Actifio DP for data protection, Actifio DR for disaster recovery and Actifio BC for business continuity. Actifio's founder and CEO Ash Ashutosh claims the startup's technology decouples data management from the underlying storage infrastructure to make the process simpler and less expensive.
Tuesday, August 10, 2010
Many companies strive to put together what is often called a defensible electronic records retention system to help them show attorneys and judges that electronic data discovery materials such as litigation holds, email archiving, deletion policies and more have not been improperly modified.
Christine Taylor, an analyst at Hopkinton, Mass.-based Taneja Group, discusses the critical components of a records retention system and shares best practices for implementing an electronic data discovery policy in your storage environment.
Sunday, August 1, 2010
Virtualization disaster recovery planning tutorial
Read more
Tuesday, July 27, 2010
Geminare Inc. today rolled out Cloud Storage Assurance (CSA) 2.0, an upgrade to the cloud storage and archiving engine for SMEs that the Canada-based company sells exclusively through managed service providers (MSPs), value-added resellers (VARs) and Software as a Service (SaaS) providers.
Wednesday, July 21, 2010
Fusion-io today introduced its newest generation solid-state storage software stack, the ioMemory Virtual Storage Layer (VSL). Fusion-io characterizes the ioMemory VSL as a hybrid operating system subsystem that combines the block-level reading and writing benefits of an I/O subsystem with the virtual addressing benefits of a virtual memory subsystem.
Read more
Wednesday, July 14, 2010
Compellent SAN is added to Orlando Magic's IT lineup
Read more
Monday, June 28, 2010
Sepaton Inc. today added storage pooling and improved monitoring and reporting features for its S2100-ES2 virtual tape libraries (VTLs) while preparing to extend the platform beyond solely a VTL to support Ethernet and file-based interfaces.
Friday, June 25, 2010
Double-Take Software Inc. this week launched Double-Take Flex for High Performance Computing (HPC), a diskless iSCSI SAN boot solution for Microsoft Windows HPC Server 2008 R2.
Read more
Thursday, June 24, 2010
Zmanda Inc. this week released the third generation of its Zmanda Cloud Backup (ZCB), adding cloud disaster recovery, support for Microsoft Server 2010 and bandwidth throttling to the online data backup service for small- to medium-sized businesses (SMBs).
Read more
Friday, June 18, 2010
Law school council IT disaster recovery plan aces its examinations
Read more
Wednesday, June 16, 2010
Verizon Business, a unit of Verizon Communications Inc., this week launched its Verizon Cloud Storage service that gives customers on-demand storage capacity; the ability to isolate their data in one or more of Verizon's five storage locations; and an application programming interface (API) to embed the service in third-party applications, such as backup and Web solutions.
Read more
Monday, June 14, 2010
Virtual tape library (VTL) vendor Sepaton Inc. today moved to beef up security for long-term data retention and regulatory compliance by adding secure data erasure technology.
Read more
Tuesday, June 8, 2010
RainStor today launched RainStor 4, the latest version of its data archiving software, adding record-level management, legal hold and deletion capabilities for compliance, as well as increasing its ingestion rate and retrieval performance.
Read more
Monday, June 7, 2010
NFS alternative meets Los Alamos National Laboratory's high-performance computing needs
The Los Alamos National Laboratory is home to the federal government's weapons research facility, which uses Panasas Inc. network-attached storage (NAS) for its research data.
However, the laboratory can't use NFS for weapons research because it doesn't scale high enough, said Gary Grider, deputy division leader for high-performance computing. So the lab uses Panasas' proprietary DirectFlow parallel client technology, as well as IBM's General Parallel File System (GPFS) and Oracle Corp.'s Lustre parallel file system.
Read more
NFS Version 4.1 (NFSv4.1) is on the horizon -- again. After years of predictions and promises for data storage managers, NFSv4.1, which supports parallel NFS (pNFS), is getting closer to shipping in NAS systems now that the pNFS spec has been approved.
What's next for NFSv4.1? How can pNFS benefit data storage administrators? Will pNFS be worth the wait? Find answers to these and other questions in this SearchStorage.com three-part tutorial on NFS Version 4.1.
pNFS spec for faster file service arrives, but NAS systems lack capable clients
pNFS and NFSv4.1 adoption still on hold with most storage array vendors
NFS alternative meets Los Alamos National Laboratory's high-performance computing needs
Asigra Inc. rolled out a new version of its cloud backup software today, adding support for mobile devices while expanding its tiering options and server virtualization platform support.
Read more
Sunday, May 23, 2010
There was no global recession in data growth in 2009, according to a Digital Universe study by Framingham, Mass.-based IDC. The technology research and consulting firm estimates the worldwide volume of digital data grew by 62% between 2008 and 2009 to nearly 800,000 petabytes (PB). IDC claims this 'Digital Universe' will grow to 1.2 million PB, or 1.2 zettabytes (ZB) in 2010 and reach 35 ZB by 2020.
Read more
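As a quick back-of-the-envelope check on those figures (assuming 1 ZB = 1,000,000 PB), the numbers reported by IDC imply roughly 40% annual growth between 2010 and 2020:

```python
# Quick check of the growth rates implied by IDC's figures (1 ZB = 1,000,000 PB).
pb_2009 = 800_000               # ~800,000 PB in 2009
zb_2010, zb_2020 = 1.2, 35.0    # IDC projections in zettabytes

pb_2008 = pb_2009 / 1.62        # back out 2008 from the reported 62% growth
cagr = (zb_2020 / zb_2010) ** (1 / 10) - 1   # implied 2010-2020 annual growth

print(f"2008 volume: ~{pb_2008:,.0f} PB")               # ~493,827 PB
print(f"Implied annual growth 2010-2020: {cagr:.0%}")   # ~40% per year
```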
Thursday, April 15, 2010
Nine Technology is getting into the crowded cloud backup service market with a platform for managed IT service providers that handles data deduplication and compression for PCs, and plans to expand to back up data on servers and eventually provide disaster recovery.
Read more
Tuesday, April 6, 2010
Quantum Corp. today launched a new high-end tape library aimed largely at enterprises with vast data archiving needs. The Scalar i6000 features an archive-tape integrity monitoring technology, higher-capacity LTO-5 tape drives, and support for larger bulk exports than the model it replaces, the Scalar i2000.
Read more
Friday, April 2, 2010
While storage resource management (SRM) tools have yet to catch on in a big way, providers are starting to push them as managed services to improve storage utilization, capacity planning, monitoring, and reporting.
Read more
Thursday, April 1, 2010
Network-attached storage (NAS) has many benefits for storing and managing files, and is becoming increasingly important as unstructured data outgrows structured data across large and small organizations. But NAS management gets complicated as the number of devices increases, resulting in what has become known as "NAS sprawl." Jeff Boles, senior analyst and director, validation services at Hopkinton, Mass.-based Taneja Group, discusses how NAS sprawl occurs and how it can be curbed through scale-out NAS, management tools, file virtualization and cloud storage services.
Friday, March 19, 2010
Small- to medium-sized businesses (SMBs) facing e-discovery requirements are beginning to use cloud storage services and other Software-as-a-Service (SaaS) offerings for data compliance needs such as email and collaboration, experts say.
Read more
Thursday, February 25, 2010
The coordination and preparation of a disaster recovery (DR) plan can be a complex operation that spans departments and facilities, and affects every IT system including applications, data storage, and telecommunications. One critical system that you may overlook is your local area network (LAN). Ensuring that your networking infrastructure is up and running so your users have access to critical IT applications and data as quickly as possible is a key component of a comprehensive network disaster recovery plan.
In this LAN DR planning tutorial, you will learn how to recognize the events that should trigger your LAN DR plan, how to prevent and mitigate those networking disasters, and the key elements of a solid LAN DR plan.
Read more
Monday, February 22, 2010
No company wants to get dragged into a lengthy e-discovery process or be penalized for failing to comply with government regulations. That's why information technology governance, which focuses on the performance and risk management of information technology systems, is so important.
Read more
Tuesday, February 9, 2010
After confronting scalability problems with NFS, the Bioinformatics Core at the University of Alaska Fairbanks adopted an iSCSI storage system and has some lessons to share from its implementation.
Read more
Tuesday, January 19, 2010
There's a lot of hardware with lots of processing power in today's data center and, as a result, a server I/O bottleneck problem and a glut of cabling to support all of the hardware. I/O virtualization promises to solve the problem, but there's no consensus on the best approach. The PCI Special Interest Group (PCI-SIG) has offered up two I/O virtualization specifications to help standardize I/O virtualization implementations, but vendors for the most part are developing products independent of the standards.
With those kinds of dynamics at play, you'll need to study the landscape closely before wading into an I/O virtualization project. Check out our tutorial to find out about the approaches different vendors are taking to I/O virtualization, the pros and cons of each product, how I/O virtualization differs from Fibre Channel over Ethernet (FCoE) and how the PCI-SIG standards work.
I/O virtualization products, standards reduce network infrastructure headaches
Virtualization and blade server technologies have enabled a generation of consolidated computing devices capable of cramming extraordinary computing power into smaller form factors. But the increased processing power per square inch has brought about a new I/O problem: The pipes can't move data fast enough to keep up with today's processors. To address that problem, new I/O virtualization products and standards are emerging to extend PCI Express (PCIe) pathways to separate I/O devices. This allows multiple physical servers and virtual machines (VMs) to share I/O resources.
Read more
Xsigo's I/O virtualization approach: Software virtualization via I/O Director
Xsigo Systems Inc.'s approach to I/O virtualization is similar to Microsoft Corp.'s and VMware Inc.'s server virtualization methods. Xsigo's VP780 I/O Director consolidates multiprotocol I/O server traffic via high-speed InfiniBand links. This software I/O virtualization approach -- one of two basic methodologies -- heavily favors server virtualization environments. Jon Toor, Xsigo's vice president of marketing, said that approximately 90% of the company's 60 or so production deployments are virtualized server environments, mostly VMware shops.
Read more
Alternative approach to I/O virtualization: PCIe bus extenders
While Xsigo Systems Inc. takes a software virtualization approach to I/O virtualization, the majority of vendors in the I/O virtualization space take a different path – implementing on the server a PCI Express (PCIe) bus extender to a "card cage" device that houses standard, off-the-shelf I/O cards and adapters. This method has been adopted by companies such as Aprius Inc., NextIO Inc. and VirtenSys Inc.
Read more
I/O virtualization and Fibre Channel over Ethernet (FCoE): How do they differ?
Since I/O virtualization and Fibre Channel over Ethernet (FCoE) technology both aim to reduce the physical infrastructure at the server/network interconnect, users may be confused about how Cisco Systems Inc.'s Unified Computing System (UCS) and FCoE support in the company's Nexus switches play in this space. In addition, many IT organizations may find the choice between the Xsigo Systems Inc. and PCIe approaches a difficult one.
Read more
Monday, January 18, 2010
There are a growing number of tools and techniques on the market to help IT organizations improve their data storage utilization rates. Arun Taneja, founder and consulting analyst at Hopkinton, Mass.-based Taneja Group, details the storage optimization and capacity reduction products available from a range of enterprise data storage vendors. Read about which tools you might be missing in this FAQ interview.
Read the transcript