Storage Coverage From Around the Web
Live from Austin, it's an enterprise and consumer tech cast with some of the biggest brains in the cloudy business. Weirding you out today are humble hosts Ed Saipetch and Sarah Vela. This week, Ed and Sarah take over the podcast live at Dell World 2012 while Greg Knieriemen takes a week off. Our special guests this week are Stephen Foskett of Tech Field Day, Stephen Spector, Cloud Evangelist at Dell, and Justin Warren, Managing Director at PivotNine.

Also making the rounds this week:
- The business case for why IT executives are making a strategic shift from RAID to Information Dispersal.
- To improve service reliability, organizations must be able to see and manage all aspects of performance, availability, and security.
- Forrester research that identifies the cost, benefit, flexibility, and risk factors affecting the investment decision for HP 3PAR Storage.
- A guide describing a new approach to solving the challenges associated with storing large volumes of unstructured data.
SAN JOSE, Calif., Dec. 20, 2012 /PRNewswire/ -- The Twelfth Annual Storage Visions® Conference, held at the Riviera Hotel Convention Center in Las Vegas, Nevada, January 6 & 7, 2013, features insightful keynote talks on solid-state digital storage technologies as well as media and entertainment applications of storage. More information on the conference is available at www.storagevisions.com. Early registration closes on December 26, 2012.

2013 Storage Visions® keynote speakers include Yoshiaki Shishikui of NHK Research Laboratories, who will talk about digital storage requirements for 8K x 4K video at the 2012 Summer Olympics, as well as another entertainment keynote presentation from Active Storage. Keynote talks from Intel and SanDisk explore important developments fueled by recent advances in solid-state storage. Peter K. Hazen, Marketing Director, Non-Volatile Memory (NVM) Solutions Group at Intel, will talk about Enabling Transformational Change in Clients and Datacenters with Solid State Drives. Following is the abstract of his talk:

The explosion of mobile computing has created an insatiable demand for instant, affordable access to the world's information. Solid-state drive solutions are fueling the innovation and proliferation of mobile computing by enabling thinner, lighter, and more responsive clients with extended battery life. Service providers and enterprises are responding to the increased demand for data by expanding and transforming their datacenters worldwide. The adoption of SSD solutions in datacenters is enabling transformational change through significant improvements in performance, power, and cost. SSD solutions enable breakthrough advances in Big Data, Cloud Computing, and High Performance Computing, as well as a broad range of enterprise IT applications. A new generation of SSDs is coming to market with new form factors, lower power, enterprise Reliability, Availability and Serviceability (RAS) features, higher and more consistent performance, and improved Quality-of-Service (QoS), all of which are accelerating innovation in client and datacenter solutions.
When a flash start-up gets a product for enterprise cloud service providers out the door, it needs to become enterprise-like itself. SolidFire has done just that, recruiting a trio of blue-chip storage execs.

SolidFire makes modular scale-out flash arrays, with quality-of-service facilities, for cloud service providers. These respond faster to data access requests from servers than disk drive arrays, or even flash-accelerated disk drive arrays, do. They also scale better, with SolidFire's arrays scaling up to 100 nodes.

The first shiny bright new exec is RJ Weigel, who becomes the firm's president. He comes from HP, where he was VP for Americas global accounts in the storage organisation, having arrived at HP via the 3PAR acquisition. At 3PAR he was worldwide VP for sales and field ops. He has also served time at NetApp and Cisco. Another HP-3PAR guy, Tim Pitcher, becomes SolidFire's international VP; he held a similar position at 3PAR.

John Hillyard becomes SolidFire's CFO. He also has an HP connection, albeit a slighter one, having been CFO at iSCSI storage startup LeftHand Networks, which was bought by HP. After that he was CFO at DataLogix. We're told: "He has a wealth of experience leading securities offerings, raising venture funding and coordinating merger and acquisition transactions."
If 2012 will be remembered for anything specific in the web development world, the debate over performance issues with localStorage will surely be high on the list. The debate began with a post by Christian Heilmann entitled There is no simple solution for localStorage, in which he made several claims about poor localStorage performance and called for changes to the existing API or the development of an alternative.

The problem with the article was that it didn't have any numbers, and that set off a large number of follow-up articles taking up the task of providing them. John Allsopp wrote localStorage perhaps not so harmful, in which he measured the amount of time it took to read and write 10KB using localStorage. I also entered the mix with my article, In defense of localStorage, in which I compared the read and write performance of localStorage to that of cookies (which have similar performance characteristics due to disk access). We both determined that localStorage didn't seem slow enough to warrant throwing it away.

After catching up with Taras Glek from Mozilla (who wrote his own post on the subject, PSA: DOM localStorage considered harmful), I realized that both John's benchmark and my own were flawed in their approach due to the way that localStorage actually works.
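To make the methodology concrete, here is a minimal TypeScript sketch of the kind of synchronous read/write loop those posts ran. This is my own reconstruction, not either author's actual harness; the key name, payload size, and iteration count are arbitrary:

// Minimal sketch of the read/write micro-benchmark style used in the
// follow-up posts (a reconstruction, not either author's exact harness).
// It times the synchronous localStorage calls, but misses the up-front
// cost of the browser reading the whole backing store from disk, which
// is the behavior the "considered harmful" argument centers on.

const payload = "a".repeat(10 * 1024); // roughly 10KB of data
const iterations = 100;

const start = performance.now();
for (let i = 0; i < iterations; i++) {
  localStorage.setItem("bench", payload);          // synchronous write
  const readBack = localStorage.getItem("bench");  // synchronous read
  if (readBack !== payload) {
    throw new Error("localStorage round-trip failed");
  }
}
const elapsed = performance.now() - start;
console.log(`${iterations} write/read cycles: ${elapsed.toFixed(2)} ms`);

The subtle flaw: by the time a loop like this runs, the browser may already have pulled the entire localStorage store into memory, so the expensive first-access disk read never shows up in the numbers. A fairer test would have to measure a cold first read after page load, which is the cost a warm loop cannot see.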
Wikibon is a professional community solving technology and business problems through the open sharing of free advisory knowledge.
According to newly released data from IDC, overall worldwide server revenue decreased by 4% in Q3 2012. Over the last decade, as server virtualization adoption has increased, we have seen a steady decline in both server revenue and unit shipments. The bright spots of growth for server revenue are blade servers and hyper-scale or "density optimized" servers. As discussed in Wikibon's Software-led infrastructure manifesto, converged infrastructure (which typically uses blade servers) and low-powered servers (which density-optimized designs are driving towards) are the disruptive technologies for the compute layer of infrastructure.
IDC's numbers for the blade segment as a whole were: 2.9% year-over-year revenue growth, a 1.1% decline in units, and 91% of all blades x86-based.
Cloud and Big Data are the target use cases for hyper-scale compute solutions. The cloud solutions typically target web-based companies or service providers, where the power and cooling of thousands of servers can outweigh many other criteria (Dell's PowerEdge C Series is the leading example). The big data offerings are typically a tightly coupled solution of compute with local storage (such as HP's SL family).
Early during my tenure as Cloud Czar, I was proud to have helped NetApp become the first infrastructure vendor to build an expansive enterprise cloud ecosystem of telco and service provider partners. Together, we serve over 77 percent of the Fortune 500 companies' core applications, and NetApp is the storage foundation for data served to more than one billion cloud users. Today, I want to share my enthusiasm for new customer cloud capabilities announced by NetApp and Amazon Web Services (AWS) at the inaugural re:Invent conference. NetApp Private Storage for AWS combines the availability, performance, agility, and control of NetApp's enterprise storage with the highly reliable, scalable, and low-cost infrastructure of the AWS cloud.

For the first time, NetApp and AWS are combining the agility of AWS compute with the advantages of data residing on NetApp's enterprise storage. Developers are no longer bound by glass ceilings on the compute power at their disposal. IT managers can still deliver high SLAs and regulatory/legal compliance with data under their complete control. Business leaders can freely balance capex and opex investments precisely according to market and financial needs rather than legacy technical limitations. This kind of IT flexibility is what separates the winners from the losers as IT continues to evolve.
Storage QoS is more than just a way to isolate workloads on a shared storage system to guarantee performance levels; it is the key enabler for long-term hybrid architecture viability. John Spiers, Co-Founder and CEO, and Kelly Long, Co-Founder and CTO, describe why. Topics include hybrid architectures, the benefits of storage QoS, storage media proliferation, hybrid storage system challenges, QoS-controlled tiering and caching, and storage efficiency and affordability.
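Storage QoS schemes differ by vendor, but the core idea of isolating workloads can be sketched with a simple token bucket that admits or queues each I/O against a per-workload rate limit. The following TypeScript is purely a conceptual illustration, with invented class names and numbers; it is not the implementation described in the webcast:

// Conceptual token-bucket sketch of per-workload storage QoS.
// Names and numbers are invented for illustration; real arrays
// enforce these limits in the data path, not in application code.

class IopsBucket {
  private tokens: number;
  private lastRefill: number;

  constructor(
    private readonly iopsLimit: number, // steady-state IOPS cap for this workload
    private readonly burst: number      // short bursts allowed above the steady rate
  ) {
    this.tokens = burst;
    this.lastRefill = Date.now();
  }

  // Returns true if the I/O may proceed now; false means it must queue.
  tryAdmit(): boolean {
    const now = Date.now();
    // Refill tokens in proportion to elapsed time, capped at the burst size.
    this.tokens = Math.min(
      this.burst,
      this.tokens + ((now - this.lastRefill) / 1000) * this.iopsLimit
    );
    this.lastRefill = now;
    if (this.tokens >= 1) {
      this.tokens -= 1;
      return true;
    }
    return false;
  }
}

// Each tenant gets its own bucket, so a noisy neighbor exhausts only
// its own tokens and cannot starve the other workloads.
const tenants = new Map<string, IopsBucket>([
  ["oltp-db", new IopsBucket(5000, 7500)],
  ["backup-job", new IopsBucket(500, 500)],
]);

A production system applies this admission decision inside the I/O path and typically adds minimum guarantees alongside caps, but the isolation principle is the same.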
Big data could create a job bonanza as organisations try to make sense of their huge hoards of information over the next few years. Within three years, 4.4 million IT staff will be working on big data projects, according to predictions from analyst Gartner, with 1.2 million big data experts needed in Western Europe alone.

Some of this big data usage will simply be an expansion of existing projects to accommodate even more data — such as the fraud detection work done by banks, or the customer churn analytics done by retailers. But across many industries, demand for big data projects — the ability to find nuggets of insight inside huge volumes of structured and unstructured data — is set to grow.
Windows Server 2012 File Server Tip: Run the File Services Best Practices Analyzer (BPA)
Windows Server 2012 includes a built-in mechanism called the Best Practices Analyzer (BPA) to check your configuration and make sure everything is set to the proper values. These rules, which come in specific sets for each role you install, can be run through Server Manager or via PowerShell. For the Windows Server 2012 File Services role, the BPA includes a number of rules, among them 99 rules for SMB.

You should definitely run the BPA when you have a chance. Here is a series of PowerShell examples showing how to do it: list the available BPA models, run the File Services scan, then group the results by severity.

    PS C:\> Get-BpaModel | Format-Table Id

    Id
    --
    Microsoft/Windows/ADRMS
    Microsoft/Windows/CertificateServices
    Microsoft/Windows/ClusterAwareUpdating
    Microsoft/Windows/DHCPServer
    Microsoft/Windows/DirectoryServices
    Microsoft/Windows/DNSServer
    Microsoft/Windows/FileServices
    Microsoft/Windows/Hyper-V
    Microsoft/Windows/LightweightDirectoryServices
    Microsoft/Windows/NPAS
    Microsoft/Windows/RemoteAccessServer
    Microsoft/Windows/TerminalServices
    Microsoft/Windows/UpdateServices
    Microsoft/Windows/VolumeActivation
    Microsoft/Windows/WebServer

    PS C:\> Invoke-BpaModel Microsoft/Windows/FileServices

    ModelId           : Microsoft/Windows/FileServices
    SubModelId        :
    Success           : True
    ScanTime          : 11/15/2012 10:48:02 PM
    ScanTimeUtcOffset : -08:00:00
    Detail            : {FST2-FS1, FST2-FS1}

    PS C:\> Get-BpaResult Microsoft/Windows/FileServices | Group-Object Severity

    Count Name        Group
    ----- ----        -----
       96 Information {Microsoft.BestPractices.CoreInterface.Result, Microsoft.BestPractices.CoreInterface...
        3 Warning     {Microsoft.BestPractices.CoreInterface.Result, Microsoft.BestPractices.CoreInterface...
When it comes to the need for a more flexible, highly available server and storage environment, I think one of our customers in France—MPO International, a home entertainment specialist—said it best. Sebastien Chateau-Dutier, Systems and Networks Manager at MPO France, explains: "For us, flexibility was central. Before we spoke to 2M-Equation, we thought all storage arrays were equal. Dell Compellent technology is based on intelligence and capability. Beyond just managing data volumes, it's also a system uniquely designed for dynamic data performance and growth."

Just six months after joining Dell Compellent as General Manager, I couldn't agree more. All storage is not created equal. That's why we've put the performance and scale critical to MPO's success at the core of the new Dell Compellent Storage Center 6.3, just announced at the Dell Storage Forum Paris. At no cost to current customers, Storage Center 6.3 can boost system performance by up to 2x. And we're doubling bandwidth with the first storage array to deliver 16Gb Fibre Channel, end to end from server to switch. Storage Center 6.3 was designed with enterprise and federal customers in mind.

I'm excited to be leading the Dell Compellent business at such a pivotal time in the industry, when we can look for ways to bring even greater performance, scale, and innovation to help our customers solve their business and storage challenges. Got more questions? Join the conversation here or follow us @DellCompellent for all of the latest news.
In this issue:
- Data Center Optimization: Federal agencies must increase server utilization and energy efficiency as they squeeze more computer processing into fewer data centers. We explore how the Army, Homeland Security, Veterans Affairs and others are doing that.
- Future Cities: The world's urban centers are growing, creating a civic management challenge of unprecedented scope and complexity. Our exclusive survey reveals the opportunities and challenges for city planners and municipal IT pros.










