Biosciences

Features

Hadoop, Triple Stores, and the Semantic Data Lake

May 26, 2015

Hadoop-based data lakes are springing up all over the place as organizations seek low-cost repositories for storing huge mounds of semi-structured data. But when it comes to analyzing that data, some organizations are finding the going tougher than expected. One solution to the dilemma may be found in Hadoop-resident graph databases and the notion of the semantic data lake. Despite their growing popularity, data lakes have taken a bit of heat lately as analyst firms like Gartner call into question Read more…
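
To make the triple-store idea concrete, here is a minimal sketch of the semantic approach the teaser alludes to. It is purely illustrative: it uses the open-source rdflib library and an in-memory graph rather than any Hadoop-resident triple store, and the ex: namespace, the gene/disease facts, and the query are hypothetical examples, not data from the article. The point is simply that facts are stored as subject-predicate-object triples and queried with SPARQL instead of being forced into fixed tables.

```python
# Illustrative sketch only: an in-memory rdflib graph, not a Hadoop-resident
# triple store. The ex: namespace, the triples, and the query are hypothetical.
from rdflib import Graph, Namespace, RDF

EX = Namespace("http://example.org/bio#")   # hypothetical vocabulary

g = Graph()
g.bind("ex", EX)

# Facts live as subject-predicate-object triples rather than table rows.
g.add((EX.BRCA1, RDF.type, EX.Gene))
g.add((EX.BRCA1, EX.associatedWith, EX.BreastCancer))
g.add((EX.TP53, RDF.type, EX.Gene))
g.add((EX.TP53, EX.associatedWith, EX.LiFraumeniSyndrome))

# SPARQL asks graph-shaped questions of the same data.
results = g.query(
    """
    SELECT ?gene ?disease WHERE {
        ?gene a ex:Gene ;
              ex:associatedWith ?disease .
    }
    """,
    initNs={"ex": EX},
)
for gene, disease in results:
    print(gene, "->", disease)
```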

Peering Into Computing’s Exascale Future with the IEEE

May 6, 2015

Computer scientists worried about the end of computing as we know it have been banging heads for several years looking for ways to return to the historical exponential scaling of computer performance. What is needed, say the proponents of an initiative called “Rebooting Computing,” is nothing less than a radical rethinking of how computers are designed in order to achieve the trifecta of greater performance, lower power consumption and “baking security in from the start.” Thomas Conte, the Georgia Tech Read more…

Saving Children’s Lives with Big Genomics Data

Apr 16, 2015

It’s estimated that one in 30 children is afflicted by one of the 8,000 or so genetic diseases discovered so far. Thanks to advances in the world of genomic testing—including faster genomic sequencers and better algorithms for making more accurate diagnoses—doctors are making progress in identifying genetic diseases and getting children the treatment they need. While the medical community has been studying genetic diseases for decades, its ability to correctly diagnose genetic diseases and prescribe a suitable course of treatment Read more…

Peek Inside a Hybrid Cloud for Big Genomics Data

Mar 30, 2015

Genomics, or the study of human genes, holds enormous promise for improving people’s health. However, at 80GB to 250GB per sequenced genome, the sheer size of genomics data poses a considerable storage and processing challenge. To keep costs in check, the Inova Translational Medical Institute (ITMI) adopted a hybrid cloud architecture for its ongoing genomics studies. ITMI is a branch of the non-profit Inova hospital network in the Washington D.C. area that received $150 million in funding by Read more…
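
For a sense of scale behind that 80GB-to-250GB figure, a rough back-of-envelope calculation helps. In the sketch below, only the per-genome size range comes from the article; the cohort size and the per-gigabyte storage rate are hypothetical placeholders, not ITMI's actual numbers.

```python
# Back-of-envelope sketch only. The per-genome size range comes from the
# article; the cohort size and the illustrative $/GB-month rate are
# hypothetical assumptions, not ITMI's actual figures.
GENOME_SIZE_GB = (80, 250)        # per sequenced genome, per the article
COHORT = 10_000                   # hypothetical number of sequenced genomes
COST_PER_GB_MONTH = 0.03          # hypothetical cloud object-storage rate (USD)

low_pb = GENOME_SIZE_GB[0] * COHORT / 1_000_000    # GB -> PB (decimal)
high_pb = GENOME_SIZE_GB[1] * COHORT / 1_000_000

low_cost = GENOME_SIZE_GB[0] * COHORT * COST_PER_GB_MONTH
high_cost = GENOME_SIZE_GB[1] * COHORT * COST_PER_GB_MONTH

print(f"Raw storage: {low_pb:.1f}-{high_pb:.1f} PB")
print(f"Monthly storage bill: ${low_cost:,.0f}-${high_cost:,.0f}")
```

At these hypothetical numbers the raw data alone runs to petabytes and tens of thousands of dollars a month, which is the kind of arithmetic that pushes an organization toward a hybrid architecture with cheaper tiers for cold data.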

Leverage Big Data Cross-Industry Panel: Video Now Available

Mar 27, 2015

Big data means different things to different people. To some, massive data is a challenge to be overcome, while for others it’s an opportunity to seize. At Tabor Communications’ recent Leverage Big Data event, experts from different industries came together to compare their big data notes. As a professor of astrophysics and computational science at George Mason University, Kirk Borne tracks things moving through the universe over space and time. But that broad description could refer to just about Read more…

News In Brief

Data Rx for Healthcare: Security, Real-Time Analytics

May 27, 2015

The life sciences and healthcare sectors have been steadily embracing big data analytics as an engine of innovation and a means of reducing costs by, among other things, intelligently allocating resources where they will do the most public good. While there is plenty of upside to applying data analytics to life sciences and healthcare, a new study identifies several key challenges, including data governance, harnessing real-time analytics, and the need for an overarching big data analytics strategy. “As big Read more…

Microsoft Scales Data Lake into Exabyte Territory

Apr 29, 2015

Microsoft today announced its Azure Data Lake, an HDFS-compatible data repository designed to store vast amounts of structured and unstructured data for customers who want to analyze or explore it later with their choice of tools. The company also announced a cloud-based data warehouse and a new graph API for MS Office. The Azure Data Lake service is “a nearly infinite data repository that supports petabyte-size files and all types of data,” says Scott Guthrie, Microsoft’s executive vice president of the Read more…

U.S. Names First Chief Data Scientist

Feb 24, 2015

An industry veteran and college math professor who is partially credited with coining the title “data scientist” has been named the nation’s first chief data scientist. The White House announced the appointment of DJ Patil to the new post last week. Patil also will serve as the Obama administration’s deputy chief technology officer for data policy, the White House said. Patil most recently served as a vice president at RelateIQ, a customer relationship management specialist acquired by Salesforce in July Read more…

Connecting the Dots on Dark Data

Feb 17, 2015

It’s estimated that 90 percent of the data in the average enterprise is “dark data”—that is, data that isn’t readily available for analytics—leaving only about 10 percent of the data open to analysis. While the inclusion of pertinent metadata and a master data management (MDM) structure is the long-term answer to the dark data problem, enterprises need a way to get value out of that data now. The dark data problem runs deep in many organizations, especially those that have implemented Read more…

Data Companies Work With Citizen Scientists on Climate

Dec 11, 2014

Storage vendor EMC Corp. is joining forces with big data and cloud specialist Pivotal and the EarthWatch Institute in an effort to apply analytical tools to the study of the impact of climate change. The partners, along with the Schoodic Institute at Acadia National Park, will study the interactions between nature and climate as part of a broader effort to promote citizen science using big data lakes, analytical tools and visualizations, the partners said. The “Big Data vs. Climate Change” initiative was Read more…

This Just In

Zettaset Boosts Big Data Protection for IBM Power Systems servers with Enterprise-Class Encryption for Hadoop and NoSQL Databases

May 20, 2015

MOUNTAIN VIEW, Calif., May 20 – Zettaset, a leader in Big Data security, today announced it has successfully completed integration of Zettaset’s Big Data Encryption software solution, BDEncrypt™, with IBM’s Power Systems scale-out servers running Linux. IBM POWER8 customers seeking an encryption solution that is optimized for Big Data environments and distributed computing architectures, including Hadoop and NoSQL, can now utilize Zettaset BDEncrypt. BDEncrypt with IBM Power Systems is designed to deliver a high level of compute performance and solution Read more…

PSC to Provide HPC Resources for Pitt Big Data Project

Oct 9, 2014

Oct. 9 — The National Institutes of Health has awarded the University of Pittsburgh an $11 million, four-year grant to lead a Big Data to Knowledge Center of Excellence, an initiative that will help scientists capitalize more fully on large amounts of available data and make data science a more prominent component of biomedical research. The Pittsburgh Supercomputing Center will provide HPC resources and expertise to support the effort. Much of science focuses on understanding the “why” or “how” Read more…