
Features

DDN Tackles Enterprise Storage Needs as ‘Wolfcreek’ Looms

Jun 30, 2015 |

When it comes to keeping supercomputers fed with data, there are few storage makers that can keep up with DataDirect Networks. But increasingly, DDN is feeling pressure from enterprises that are struggling to cope with the ongoing data explosion and mixed I/O workloads. That’s where DDN’s forthcoming high-end storage array for the broader enterprise market, codenamed “Wolfcreek,” comes into play. Wolfcreek is DDN’s next-generation converged architecture for enterprise customers. The system borrows technology from DDN’s SFA12K line Read more…

Future of Connected Cities on Display in L.A.

Jun 26, 2015 |

Cities are sitting on mounds of data generated by everything from parking meters to traffic lights to water meters, but they’re having difficulty harnessing that data effectively. As the big data and Internet of Things (IoT) trends evolve, we’ll see cities competing with one another to use data to serve citizens, and that’s where vendors like Los Angeles-based Civic Resource Group International are hoping to step in. Yesterday in downtown LA, CRGI showcased what the future of a connected city Read more…

How Apache Spark Is Helping IT Asset Management

Jun 24, 2015 |

There’s been a lot of energy focused on how big data technology can improve the sales, marketing, service, and support departments of corporations. Tools like Hadoop, Spark, and NoSQL databases are changing the rules for how work gets done, and it’s very exciting. But big data tech is slowly creeping into the IT department itself, which might need it most of all. While the typical IT department may be familiar with big data concepts and technologies, they’re usually working with Read more…

A Tournament of Machine Learning Champions

Jun 16, 2015 |

Behind today’s powerful predictive applications are machine learning models that identify patterns in the data. But getting these models trained and tuned is not an easy process. Analytics giant SAS thinks it has a solution with a new offering called Factory Miner that allows you to run tournaments to pick the best machine learning model for a particular data set. Factory Miner lets users build and test a slew of different machine learning models using seven different algorithm families. Testing more Read more…
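SAS hasn’t published Factory Miner’s internals, but the tournament idea itself is straightforward: train several candidate models on the same data, score each on a held-out split, and keep the winner. A minimal, stdlib-only sketch of that concept (the toy models and data below are illustrative, not SAS’s algorithm families):

```python
import random

# Candidate "models": each is (name, fit), where fit(train) returns a predict function.
def fit_mean(train):
    # Baseline: always predict the training mean.
    mu = sum(y for _, y in train) / len(train)
    return lambda x: mu

def fit_linear(train):
    # Ordinary least squares for y = a*x + b on a single feature.
    n = len(train)
    sx = sum(x for x, _ in train)
    sy = sum(y for _, y in train)
    sxx = sum(x * x for x, _ in train)
    sxy = sum(x * y for x, y in train)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return lambda x: a * x + b

def tournament(data, candidates, holdout=0.3, seed=0):
    """Train each candidate on a random split, score on held-out data by MSE,
    and return the winning model's name plus all scores."""
    rng = random.Random(seed)
    shuffled = data[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - holdout))
    train, test = shuffled[:cut], shuffled[cut:]

    def mse(predict):
        return sum((predict(x) - y) ** 2 for x, y in test) / len(test)

    scores = {name: mse(fit(train)) for name, fit in candidates}
    winner = min(scores, key=scores.get)
    return winner, scores

# Perfectly linear data, so the linear candidate should win the tournament.
data = [(x, 2.0 * x + 1.0) for x in range(20)]
winner, scores = tournament(data, [("mean", fit_mean), ("linear", fit_linear)])
```

A production tool like Factory Miner would swap in real algorithm families (trees, neural nets, regressions) and proper cross-validation, but the selection loop is the same shape: fit, score, compare, keep the best.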

Five Ways Big Genomic Data Is Making Us Healthier

Jun 3, 2015 |

We’ve come a long way since the human genome was first decoded 13 years ago. Thanks to continued advances in high performance computing and distributed data analytics, doctors and computer scientists are finding innovative new ways to use genetic data to improve our health. The economics of genetic testing is creating a flood of genomic data. The Human Genome Project took over a decade and cost $3 billion, but today whole-genome tests from Illumina can be had in a Read more…

News In Brief

Bonneville Power Uses Analytics to Manage Demand

Jun 18, 2015 |

Among the challenges faced by electric utilities, especially during the summer cooling season, is managing peak demand while figuring out how to integrate intermittent power sources into the grid. Increasingly, big data analytics tools are being applied to the emerging smart grid as a way to gauge demand while improving grid reliability and lowering costs. The latest example comes from the Bonneville Power Administration, the U.S. Energy Department agency that oversees power distribution in the Pacific Northwest. AutoGrid Systems, a Read more…

Arcadia Emerges with Visual Analytics Running Directly On Hadoop

Jun 9, 2015 |

Arcadia Data today emerged from stealth mode by announcing $11.5 million in Series A funding and the limited availability of its flagship product: a Web-based visual analytics tool served directly from Hadoop. In Hadoop, most visualization and BI tools work by sending SQL queries to data stored in HDFS and then moving the results over the network to a separate box that drives the analytics and visualizations. While this works, it introduces limitations in the amount of data that can Read more…

IBM, UK to Collaborate on Big Data

Jun 5, 2015 |

The British government and partner IBM are investing £300 million (more than $456 million) to promote big data and cognitive computing research aimed at helping U.K. businesses engage with customers and suppliers. Among IBM’s contributions to the big data project announced this week is a package of technology and “onsite expertise” valued at £200 million, including access to its Watson cognitive computing platform and at least 24 IBM researchers. They will work with U.K. big data scientists. The British government Read more…

Pentaho Eyes Spark to Overcome MapReduce Limitations

May 12, 2015 |

Pentaho today announced it’s supporting Apache Spark with its suite of data analytics tools. While supporting Spark gives Pentaho performance advantages over MapReduce when executing data transformations and running queries within Hadoop, the software company is approaching the in-memory framework for other use cases with a wary eye. Apache Spark will be a supported engine with the next release of Pentaho’s analytical platform, which is due in mid to late June. Pentaho is eyeing two main use cases with this Read more…

NOAA Launches Big Data Project

May 6, 2015 |

A big data project launched by the U.S. Commerce Department brings together major cloud analytics vendors and the Open Cloud Consortium to develop infrastructure needed to access the more than 20 terabytes of satellite weather data generated each day. The National Oceanic and Atmospheric Administration (NOAA), a Commerce Department agency, currently manages the daily torrent of observational data collected by weather satellites and other ocean- and land-based sensors. Last month, NOAA and Commerce officials announced a research agreement with cloud Read more…

This Just In

Ryft Open API Library Extends Lightning-Fast Batch and Streaming Analysis to Existing Big Data Environments

May 21, 2015 |

ROCKVILLE, MD., May 21 — Ryft announced today the availability of the Ryft Open API Library, expanding the company’s Ryft ONE high-performance data analysis capabilities into existing big data environments. Developers can now invoke or extend the Ryft Open API to add hardware-driven acceleration to applications that demand real-time intelligence from all types of batch and streaming data. Now, Ryft combines the flexibility of an open, future-proof 1U platform with the speed, simplicity, scalability and security of a purpose-built analytics accelerator for 100X faster enterprise decision-making Read more…

Paxata Partners With Carahsoft

Mar 13, 2015 |

March 13 — Paxata today announced its partnership with Carahsoft Technology Corp., the trusted government IT solutions provider, to bring the award-winning Paxata Adaptive Data Preparation application and platform to government agencies to help them automate, collaborate and dynamically govern the data integration, data quality and enrichment process in a self-service fashion. “Paxata fills a critical technology gap between the industry-leading data management capabilities provided by Cloudera and innovative data visualization functionality pioneered by Tableau,” said Michael Shrader, Vice President, Intelligence and Innovative Read more…

Analytics Hub to “Predict” General Election Outcome

Mar 6, 2015 |

March 6 — BirdSong Analytics (http://www.birdsonganalytics.com/politics/) will provide daily statistics for the key parties on Twitter and Facebook, showing who the winners and losers are as the political race heats up. The daily updates will track all major political parties, charting their growth (or loss) of followers day by day and highlighting the most popular tweets each day. As the election nears, these tables will help identify which parties are best at engaging on social media. It will show who is Read more…

The White House Names Dr. DJ Patil as the First U.S. Chief Data Scientist

Feb 19, 2015 |

Feb. 19 — Today, I am excited to welcome Dr. DJ Patil as Deputy Chief Technology Officer for Data Policy and Chief Data Scientist here at the White House in the Office of Science and Technology Policy. President Obama has prioritized bringing top technical talent like DJ into the federal government to harness the power of technology and innovation to help government better serve the American people. Across our great nation, we’ve begun to see an acceleration of the power of data Read more…

NSA Releases New Technology for Open Source Community

Nov 26, 2014 |

Nov. 26 — The National Security Agency announced today the public release of its new technology that automates data flows among multiple computer networks, even when data formats and protocols differ. The tool, called “Niagarafiles (Nifi),” could benefit the U.S. private sector in various ways. For example, commercial enterprises could use it to quickly control, manage, and analyze the flow of information from geographically dispersed sites – creating comprehensive situational awareness. The software is “open source,” which means its code Read more…
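NiFi itself is a Java system with far richer routing, provenance, and protocol mediation than any snippet can show, but the flow-based idea it automates can be illustrated with a toy pipeline: independent processors connected by queues, converting record formats as data passes between stages. Everything below (the processor names and the sample CSV data) is invented for illustration:

```python
import json
import queue

def csv_source(out_q):
    # Processor 1: emit raw CSV lines, then an end-of-stream marker (None).
    for line in ["id,value", "1,alpha", "2,beta"]:
        out_q.put(line)
    out_q.put(None)

def csv_to_json(in_q, out_q):
    # Processor 2: mediate between formats -- consume CSV, emit JSON records.
    header = in_q.get().split(",")
    while (line := in_q.get()) is not None:
        out_q.put(json.dumps(dict(zip(header, line.split(",")))))
    out_q.put(None)

def sink(in_q):
    # Processor 3: collect the converted records at the destination.
    results = []
    while (rec := in_q.get()) is not None:
        results.append(rec)
    return results

# Wire the processors together with queues, as a flow-based tool would.
q1, q2 = queue.Queue(), queue.Queue()
csv_source(q1)
csv_to_json(q1, q2)
records = sink(q2)
```

In a real NiFi deployment each processor runs independently and the queues buffer data between sites, which is what lets the flow keep moving even when the systems on either end speak different formats or protocols.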