cioinsight.com
  • Tableau announced the release of Tableau Online 9.0, the latest version of the business intelligence company's cloud analytics offering.

  • The NoSQL market is positioned as an alternative to traditional, often expensive relational databases.

  • With more and more businesses tapping into the power of IBM's Watson Analytics cognitive computing solution, Big Blue announced an enterprise version.

  • Teradata enhanced its database to enable users to mix and match row and column technologies without the trade-offs of purely columnar databases.

  • The technology behind Cloud Bigtable is the same one powering Gmail and other Google apps.

  • IBM has expanded its Watson ecosystem with new partnerships, cognitive computing apps and services.

  • Over the past year, vendors have been able to accelerate data encryption in Cloudera's Hadoop distribution without hurting system performance.

  • Graph database maker Neo Technology has initiated a new partner program around its Neo4j database solution.

  • In e-discovery, the electronic exchange of documents between parties during litigation, an organized database makes all the difference. But no two companies' database needs are alike; they differ by industry, company size, case flow and more. Without an organized and flexible database, companies risk missing out on cost and time savings, or worse, sharing privileged documents with the opposition. In the most extreme circumstances, this can hurt the outcome of the case itself, as both parties shift their focus to arguing over which documents to exchange, or to reproducing documents, instead of what really matters: winning the case. In the end, it all comes down to planning, according to eWEEK reporting and conversations with kCura, which offers several tips and best practices for building an optimal database for an organization's e-discovery needs. The biggest mistake companies and law firms make is diving into the database without first building and implementing a process.

  • A stunning amount of big data is transmitted and collected by millions of digital devices every day, but that's just one part of the big data picture, and not even the most important one. As Atul Butte of the Stanford School of Medicine said, "Hiding within those mounds of data is knowledge that could change the life of a patient, or change the world." In practical terms, it's the people actually doing something with large data sets who are shaping new industries and reshaping old ones, from digital filmmaking and satellite imagery to analytics for business, health care and finance. These data-intensive industries need software that can transfer large files and data sets. Luckily for them, as well as for smaller media companies, the era of cloud computing has been a game-changer, enabling shared, vendor-managed infrastructures that have drastically lowered software prices. Software-as-a-service (SaaS) solutions have made it possible for any company anywhere in the world to access large file transfer technology. Signiant, a provider of secure and efficient on-premises and SaaS large file transfer solutions for companies of all sizes, has seen it all. Here are 10 examples of how different industries are moving big data.

  • At the Interop conference, the NBA's CIO provides details on the IT behind North America's professional basketball league.

  • Deep Information Sciences has launched Deep Engine, a plug-and-play storage engine that brings scale and performance to MySQL databases without the need to recode or redeploy applications that use MySQL as a data source. What's more, Deep Engine installs as a simple 10MB plug-in that can run alongside other storage engines or replace them entirely, as sketched below. Deep Engine brings hybrid transactional and analytical processing (HTAP) capabilities to MySQL, making it suitable for big data and other data-intensive projects that require billions of rows of information in both structured and unstructured record formats. It also delivers substantial performance improvements and far greater scale, thanks to compression technology and machine-learning routines that use intelligent heuristics to better organize queries and storage elements. Taken together, Deep Engine can potentially delay or prevent the need to transition to more expensive server and database platforms to handle big data or other data-intensive projects.
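
    The article doesn't show Deep Engine's actual installation steps, so the following is only a minimal sketch, assuming MySQL's general mechanism for pluggable storage engines (INSTALL PLUGIN plus a per-table ENGINE clause) and the mysql-connector-python driver. The plugin name "deep" and library file "libdeep.so" are hypothetical placeholders, not documented values.

    # Minimal, hypothetical sketch of loading a pluggable MySQL storage
    # engine and pointing a table at it. The plugin name "deep" and the
    # shared library "libdeep.so" are assumptions for illustration only.
    import mysql.connector  # pip install mysql-connector-python

    conn = mysql.connector.connect(
        host="localhost", user="root", password="secret", database="bigdata")
    cur = conn.cursor()

    # Register the storage engine with the running server; it can coexist
    # with InnoDB and any other installed engines.
    cur.execute("INSTALL PLUGIN deep SONAME 'libdeep.so'")

    # Application SQL stays the same; only the table's ENGINE clause
    # selects the new storage engine.
    cur.execute("""
        CREATE TABLE events (
            id BIGINT NOT NULL AUTO_INCREMENT PRIMARY KEY,
            payload TEXT,
            created_at DATETIME
        ) ENGINE=deep""")

    # Confirm which engines the server now exposes.
    cur.execute("SHOW ENGINES")
    for engine, support, *_ in cur.fetchall():
        print(engine, support)

    cur.close()
    conn.close()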

  • A bug surfaces soon after the release of SQL Server 2014's first service pack, forcing Microsoft to temporarily disable downloads of the update.

  • Mark your calendars. As it did with Windows XP, and will soon do with Windows Server 2003, Microsoft is sounding the alarm on SQL Server 2005's impending support sunset.

  • IBM announced a slew of new software and services to help foster an ecosystem around its recently announced Internet of Things (IoT) play.
