The Data Virtualization Gold Standard

Robert Eve

Top Stories by Robert Eve

The benefits of providing the business side of the house with better access to information assets are many. But so too are the integration challenges that IT must address. Is there a better way? Consider data abstraction, a concept well understood by SOA architects that has recently gained favor with information architects within an overall data virtualization strategy.

IT Complexity Reigns

Consider the information landscape in a typical large enterprise. It includes:

- Large volumes of complex, diverse data
- A wide landscape of application silos and fit-for-purpose data stores
- Data sources, each with its own schema and syntax
- Few sources structured properly for consumption by other applications
- Many sources that are incomplete, duplicated, or both

And new sources, middleware, and consumers are added at a relentless pace. Little wonder that IT has a huge backlog. Data Ab... (more)
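To make the data-abstraction idea concrete, here is a minimal sketch (not the product's implementation) of a virtual view that hides two source-specific schemas behind one canonical structure; all source names and fields below are hypothetical.

```python
# A minimal sketch of data abstraction: two sources with different schemas
# and syntax are exposed to consumers through one canonical "virtual view".
# All names (billing_cust, fetch_crm_rows, etc.) are hypothetical.

import sqlite3

# Source 1: a relational silo with its own schema (cust_id, cust_nm).
def fetch_billing_rows():
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE billing_cust (cust_id INTEGER, cust_nm TEXT)")
    conn.execute("INSERT INTO billing_cust VALUES (1, 'Acme Corp')")
    return conn.execute("SELECT cust_id, cust_nm FROM billing_cust").fetchall()

# Source 2: an application API that returns differently shaped records.
def fetch_crm_rows():
    return [{"customerId": "1", "name": "Acme Corporation", "region": "EMEA"}]

# The abstraction layer: map both source schemas onto one canonical view,
# so consumers never see the source-specific structure.
def customer_view():
    canonical = {}
    for cust_id, name in fetch_billing_rows():
        canonical[str(cust_id)] = {"id": str(cust_id), "name": name, "region": None}
    for rec in fetch_crm_rows():
        row = canonical.setdefault(
            rec["customerId"],
            {"id": rec["customerId"], "name": None, "region": None})
        row["name"] = rec["name"]          # treat CRM as the system of record for names
        row["region"] = rec.get("region")
    return list(canonical.values())

if __name__ == "__main__":
    for row in customer_view():
        print(row)
```

Consumers query customer_view() rather than the silos themselves, which is the point of the abstraction: source schemas can change or be consolidated without touching the consuming applications.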

What Is Your Strategy for Data Virtualization?

Data virtualization is a data integration approach used by innovative organizations to achieve business agility. Do you want to achieve similar gains, but are unsure where to apply data virtualization? There is no need to guess. Follow the leaders down one or more of the five paths to data virtualization success.

Five Popular Data Virtualization Usage Patterns

Data virtualization is a versatile data integration solution that can be deployed to solve a wide range of data integration challenges. Based on nearly ten years of successful implementations, several common usage patterns... (more)

Roadmap for Data Virtualization

It is clear that information management is becoming more difficult. Business users today are far more demanding as competition and cost-cutting drive IT requirements, and information-savvy business staff members deploy more "do-it-yourself" capabilities such as those described in Data Virtualization at Pfizer: A Case Study. Information overload, due to exponential volume growth and omnipresent delivery on our desktops, phones, and tablets, is something we all feel at a personal level. Further, the IT environment keeps getting more complex as we add new big data sources, such as Hadoo... (more)

Will IT Share the Fate of the Titanic?

On April 15, 1912, the maiden voyage of the seemingly unsinkable RMS Titanic ended in disaster. One hundred years later to the day, is IT on course for a similar collision, with a similarly catastrophic fate?

An Increasingly Icy Relationship Between Business and IT

Business dissatisfaction with IT is well-chronicled by TDWI and others. IT responsiveness surveys [1] show an average time to add a data source of nearly eight weeks, with another seven weeks added on to create a new report or dashboard. TDWI research on the top five drivers for self-service BI [2] shows desultory results o... (more)

When Should We Use Data Virtualization?

People often ask me, “When is it appropriate to use data virtualization versus a more traditional form of data integration (e.g., consolidation using ETL)?” My answer often surprises them because this is not an either-or situation: data virtualization should be a ubiquitous layer in your infrastructure that all data consumers use to access enterprise data.

Data Virtualization Focuses on the Data Consumer

Whether the data underneath the virtualization layer is pre-consolidated (using ETL), pre-materialized (as a cache), or exists only in the original transactional and operational sy... (more)
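To illustrate the consumer-facing point, here is a minimal sketch, with hypothetical names throughout, of the kind of layer the argument implies: the consumer issues one query, and the virtualization layer decides whether to serve it from a cache, from a pre-consolidated table, or from the live operational source.

```python
# A minimal sketch, not a product API: the consumer calls one query
# interface, and the layer decides whether the answer comes from a
# pre-materialized cache, a pre-consolidated (ETL) table, or the
# original operational systems. All names are hypothetical.

from typing import Callable

class VirtualizationLayer:
    def __init__(self):
        self._cache: dict[str, list[dict]] = {}        # pre-materialized results
        self._warehouse: dict[str, list[dict]] = {}    # pre-consolidated (ETL) tables
        self._live_sources: dict[str, Callable[[], list[dict]]] = {}  # operational systems

    def register_live_source(self, view_name: str, fetch: Callable[[], list[dict]]) -> None:
        self._live_sources[view_name] = fetch

    def load_warehouse_table(self, view_name: str, rows: list[dict]) -> None:
        self._warehouse[view_name] = rows

    def query(self, view_name: str) -> list[dict]:
        # The consumer never knows (or cares) which branch served the data.
        if view_name in self._cache:
            return self._cache[view_name]
        if view_name in self._warehouse:
            return self._warehouse[view_name]
        rows = self._live_sources[view_name]()          # fetch on demand
        self._cache[view_name] = rows                   # materialize for next time
        return rows

# Usage: the same call works whether "customers" was consolidated by ETL
# or "orders" is being fetched live from the order-entry system.
dv = VirtualizationLayer()
dv.load_warehouse_table("customers", [{"id": 1, "name": "Acme Corp"}])
dv.register_live_source("orders", lambda: [{"order_id": 100, "customer_id": 1}])

print(dv.query("customers"))   # served from the pre-consolidated table
print(dv.query("orders"))      # served live, then cached
```

The design choice the sketch highlights is the one the article argues for: consolidation, caching, and federation become physical options behind a single logical access layer, rather than decisions each data consumer has to make on its own.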