Welcome to Issue 5 of Triton News.
Our Hot Topic this month is about managing the need for 24/7 availability. Our Sales & Marketing Director, Paul Stoker, talks exclusively to DB2 expert James Cockayne from EDS about his experience of the 24/7 availability challenge.
Julian Stuhler talks about "Managing the Data Explosion" in his article below.
As ever, we love to hear from you so please email us with any comments or suggestions.
If you wish to unsubscribe from this communication please follow the link at the bottom.
|Managing the challenge of 24/7 availability
Paul Stoker chats to James Cockayne of EDS about some of the headaches of managing databases in a 24/7 environment.
James gives us his views on key issues for DBAs, the newest active-active availability solutions and how a DBA can ensure they get their all-important beauty sleep!
Just how important is it for organisations to ensure they have a robust database availability & disaster recovery solution?
The database is always the lynchpin of an application. Without the database, the application doesn't function, and there's a bunch of angry users wanting to know why (not to mention angry managers seeing money slipping down the drain). More and more there is an expectation that the services we provide will always be available when the customer wants them, and as expectations get higher the tolerance for failure gets lower. It is vital that we not only ensure the customer can access the service they want during the traditional 'working day', but increasingly all the way up to 24x7 availability.
What are some of the issues for database administrators looking after databases in a 24/7 environment?
DB2 UDB on distributed platforms offers a highly reliable database solution, but in the real world many different applications and scripts run against our databases, which are hosted on various pieces of complicated hardware, and problems will occur. An outage during the day is bad enough, but at least the DBAs and other support teams are at work and ready to respond. When supporting a 24x7 system, a failure in the middle of the night can be, literally, a nightmare as the DBA has to get connected and conduct a diagnosis and fix as quickly as possible - after all, in a 24x7 system the middle of the night here in the UK is daytime for customers elsewhere in the world. Often an issue will mean calling (and usually waking up) other support personnel from teams such as server support, storage, or networks for diagnosis or resolution - not to mention managers on the escalation list...
Visit our blog for the full interview
|Who's Lookin' At You, Kid?
We've been focussing on security a lot recently, specifically the internal threats which face our organisations. It is vital that we know who is looking at what data and that access and updates can be tracked and attributed to end users.
Who? What? When? Where?
"Anonymous Insiders" can be a major threat to the security of our internal systems. Today's web applications enable thousands of end-users to efficiently connect to application servers. To improve performance, application servers establish a number of persistent pooled connections to corporate databases under the authority of a single generic user - such as WEBADM. In the eyes of the database, all data access and updates are conducted by this userid and the real end-user identity is lost within the connection pool. With no correlation between actual privileged end-users and their database activities, anonymous insider malfeasance can occur without evidence of accountability.
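The identity-loss pattern described above can be sketched in a few lines. This is a minimal illustration using hypothetical table and variable names (a real DB2 deployment would pass the end-user identity through client-info properties rather than an application-managed audit column):

```python
import sqlite3

# Sketch: all web requests share one pooled connection authenticated as a
# single generic userid, so the database sees every statement as WEBADM.
# Attribution survives only if the application explicitly records the
# real end user alongside each action.

POOL_USERID = "WEBADM"              # the one generic identity the pool uses

conn = sqlite3.connect(":memory:")  # stands in for the corporate database
conn.execute("CREATE TABLE audit (db_user TEXT, end_user TEXT, action TEXT)")

def audited_action(end_user, action):
    # The database-level identity is always the pool's userid; the real
    # end user must be carried as extra context or it is simply lost.
    conn.execute("INSERT INTO audit VALUES (?, ?, ?)",
                 (POOL_USERID, end_user, action))

audited_action("alice", "UPDATE accounts")
audited_action("bob", "SELECT salaries")

rows = conn.execute("SELECT db_user, end_user FROM audit").fetchall()
print(rows)  # every row's db_user is WEBADM; end_user restores attribution
```

Without the `end_user` column, the audit trail would only ever show WEBADM - the "anonymous insider" problem in a nutshell.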
Brother-Watchdog tracks access and updates to corporate data according to your unique requirements. Like a surveillance camera, Brother-Watchdog captures and records all data activities to create immutable audit trails giving you the power to hold ALL privileged users accountable.
|Extend the Value of IBM DB2 Content Manager OnDemand through Online Report Mining
In order to leverage the most from your investment in IBM DB2 Content Manager OnDemand it is vital that you are able to open up the vast amounts of corporate data locked in stored, static reports. Users need access to dynamic business-driven analysis.
Triton have teamed up with Datawatch to bring their innovative report mining solution to the UK market.
Monarch Report Mining Server is a web-based report mining solution that turns reports stored in IBM DB2 Content Manager OnDemand into real, actionable data with just the click of a mouse, allowing users to leverage their existing report management and archive systems as a new source of live data.
Managing the data explosion! By Julian Stuhler, Solutions Delivery Director - Triton Consulting, IDUG President & IBM Data Champion
Drowning in data
As Information Technology becomes ever more prevalent in nearly every aspect of our lives, the amount of data generated and stored continues to grow at an astounding rate. According to IBM, worldwide data volumes are currently doubling every two years. IDC estimates that 45GB of data currently exists for each person on the planet: that's a mind-blowing 281 billion gigabytes (281 exabytes) in total. While a mere 5% of that data will end up on enterprise data servers, it is forecast to grow at a staggering 60% per year, resulting in 14 exabytes of corporate data by 2011.
This article takes a look at some of the reasons behind this data explosion, and some of the possible effects if the growth is not managed. We'll also examine some of the ways in which these problems can be avoided.
A major trend over the last few years has seen many organisations implementing ERP and CRM solutions. This in turn has caused a dramatic increase in the amount of data we are storing about our customers, prospects, partners and suppliers.
Companies are also investing in ever more sophisticated business intelligence and analytics. In an increasingly competitive marketplace, the ability to base business decisions on solid, reliable and timely management information is becoming a key differentiator, but trend analysis can require very large amounts of historical data to be stored and managed.
The trend towards company consolidation is not a new one, but the current economic situation has inevitably resulted in a significant increase in the number of mergers and acquisitions. This is creating a huge increase in data volumes, with the associated data duplication and application retirement issues. Organisations are faced with not only managing all of their own data, both historic and current, but also this influx of additional data from other parties. Imagine the "data headache" of combining all of the ERP, CRM, Business Intelligence and Analytic systems from different organisations into one manageable enterprise system.
Corporate compliance legislation has had a major effect on how we use, store and maintain our data. The requirements placed on organisations by HIPAA, SOX, Basel II and others mean that many companies are having to keep hold of more data, and for longer periods. Just as importantly, that retained data rapidly transforms from a corporate asset to a liability once the legal minimum retention period has expired, making it vital that such data can be accurately identified and deleted.
It is vital that organisations adhere to this legislation in order to avoid the cost of court appearances, heavy fines and the resultant damage to the brand.
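The retention point above comes down to a simple, date-driven operation: find every record older than the legal minimum and remove it. A minimal sketch, using a hypothetical table and an assumed seven-year retention rule:

```python
import sqlite3
from datetime import date, timedelta

# Illustrative only: once the legal minimum retention period has passed,
# retained records become a liability, so they must be identifiable and
# deletable purely by their creation date.

RETENTION_DAYS = 7 * 365          # assumed seven-year retention rule

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE records (id INTEGER, created TEXT)")
db.executemany("INSERT INTO records VALUES (?, ?)", [
    (1, "2000-01-15"),                  # long past any retention period
    (2, date.today().isoformat()),      # still inside retention
])

cutoff = (date.today() - timedelta(days=RETENTION_DAYS)).isoformat()
deleted = db.execute("DELETE FROM records WHERE created < ?",
                     (cutoff,)).rowcount
remaining = db.execute("SELECT id FROM records").fetchall()
print(deleted, remaining)  # the expired row is removed, the current one kept
```

The hard part in practice is not the DELETE itself but ensuring every table carries a reliable creation or retention date to drive it.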
New capabilities within the databases used to store corporate information are another major driver of data growth. For example, DB2 now supports XML and LOBs ("large objects" such as audio, video, images, etc). The ability to store this kind of data alongside more traditional structured information can be very useful, but can also have a huge impact on the overall size of the database.
Other technical trends that are contributing to database growth include storage of data in Unicode format (which can often expand overall database size by 10%-50% depending on the data), and duplication of databases due to replication requirements and/or backup strategies.
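The Unicode expansion mentioned above is easy to demonstrate: characters that occupy one byte in a legacy single-byte codepage take two or more bytes in UTF-8 (and always at least two in UTF-16). A quick comparison on some sample strings:

```python
# Why converting a database to Unicode grows it: accented characters that
# fit one byte in a single-byte codepage such as Latin-1 need two bytes
# in UTF-8, and every character needs at least two bytes in UTF-16.

samples = ["plain ascii text", "café münchen straße", "données archivées"]

for s in samples:
    latin1 = len(s.encode("latin-1"))    # legacy single-byte size
    utf8 = len(s.encode("utf-8"))
    utf16 = len(s.encode("utf-16-le"))
    growth = 100 * (utf8 - latin1) / latin1
    print(f"{s!r}: latin-1={latin1}B  utf-8={utf8}B  "
          f"utf-16={utf16}B  (+{growth:.0f}% in utf-8)")
```

Pure ASCII text does not grow at all in UTF-8, while text rich in accented characters grows noticeably, which is why the overall expansion for a real database depends so heavily on the data it holds.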
Finally, there's the perennial problem of removing old or obsolete data once it has reached the end of its useful life. Application data archiving is often considered as an optional extra, and even if it is included in the initial project plan it is often the first item to be postponed to a later release.
|Making System z work harder for you
Data Warehousing & BI on the mainframe
Want to increase your decision-making power while leveraging your investment in System z and lowering total cost of ownership?
Make better decisions faster by utilising your System z investment for data warehousing and business intelligence. Put more useable information into the hands of more people across your organisation, enabling them to make faster and better-informed decisions - with a single version of the truth in real time - to increase your competitive advantage.
to download the Data Warehousing on System Z white paper.
|Join the FREE webinar
We invite you to join our free webinar being run jointly with IBM & xkoto to discuss high availability technologies for DB2.
- Hear from industry experts about high availability technologies, their capabilities & limitations
- Learn how you can implement active-active databases to protect critical applications from unplanned outages and eliminate the need for scheduled maintenance windows
- Hear how organisations like yours have benefitted from implementing database virtualization technology
Our co-presenters will be:
David Tung - Systems Engineer & GRIDSCALE Expert - xkoto
Senior DB2 Evangelist - IBM