Evangelizing Mainframe

IBM DB2 Analytics Accelerator 101

The IBM DB2 Analytics Accelerator is an appliance add-on to your mainframe. So, in effect, it’s a separate box that connects to the mainframe.

The Analytics Accelerator, along with DB2 for z/OS, makes up a self-managing, hybrid, workload-optimized database management system that runs query workloads in the most efficient way. In practice, that means each query is executed in whichever environment gives the greatest performance and cost efficiency.
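
DB2 decides at run time where each query executes. From an application's point of view, acceleration is switched on with the CURRENT QUERY ACCELERATION special register and the optimizer does the rest. Here's a minimal sketch, assuming a Python client using IBM's ibm_db driver; the connection string, credentials and table name are all placeholders:

    import ibm_db

    # Placeholder connection details for a DB2 for z/OS subsystem
    conn = ibm_db.connect(
        "DATABASE=DSNV110P;HOSTNAME=zhost.example.com;PORT=446;"
        "PROTOCOL=TCPIP;UID=dbuser;PWD=secret;", "", "")

    # Allow acceleration for this session; DB2 still runs the query
    # natively if it isn't eligible for offload.
    ibm_db.exec_immediate(conn, "SET CURRENT QUERY ACCELERATION ENABLE")

    stmt = ibm_db.exec_immediate(
        conn, "SELECT REGION, SUM(AMOUNT) FROM SALES.ORDERS GROUP BY REGION")
    row = ibm_db.fetch_tuple(stmt)
    while row:
        print(row)
        row = ibm_db.fetch_tuple(stmt)

    ibm_db.close(conn)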

This hybrid computing platform on zEnterprise:

  1. Supports transaction processing and analytics workloads concurrently, efficiently and cost-effectively
  2. Delivers industry-leading performance for mixed workloads

Because the Analytics Accelerator is a separate box, it doesn't use mainframe MIPS. In fact, it runs on IBM PureData System for Analytics, powered by Netezza technology (IBM acquired Netezza in 2010). The advantages of this setup include:

  1. Unprecedented response times for 'right-time' analysis
  2. Complex queries that run in seconds rather than hours
  3. Transparency to the application
  4. Inheritance of all z Systems DB2 attributes
  5. No need to create or maintain indexes
  6. No query tuning
  7. Fast deployment and time-to-value

Tables need to be defined and deployed to the Analytics Accelerator before data can be loaded and queries sent to it for processing. Definition means identifying the tables whose queries need to be accelerated. Deployment means making those tables known, i.e., storing their metadata in the DB2 and PureData catalogs. IBM DB2 Analytics Accelerator Studio, which has a GUI, guides you through defining and deploying tables, as well as invoking other administrative tasks.

Stored procedures implement the various administrative operations, such as table deployment, load and update, and they serve as the primary administrative interface to the Analytics Accelerator from the outside world, including from IBM DB2 Analytics Accelerator Studio. They can be run from the command line or embedded in custom applications, and they provide functions related to both tables and accelerators. All of the stored procedures commit the transactions that were triggered by the calling applications.
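
As an example of embedding them, here's a hedged sketch of calling the SYSPROC.ACCEL_ADD_TABLES stored procedure (table deployment) from Python with ibm_db.callproc(). The accelerator name, schema, table and connection alias are placeholders, and the XML is deliberately simplified; the exact document schema is defined in the Accelerator's stored-procedure reference:

    import ibm_db

    conn = ibm_db.connect("DSNV110P", "dbuser", "secret")  # placeholder alias

    # Simplified placeholder for the table-specification XML document
    table_spec = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<tableSpecifications>'
        '<table schema="SALES" name="ORDERS"/>'
        '</tableSpecifications>')

    # IN: accelerator name and table specification; the last parameter
    # returns a message/diagnostics document
    result = ibm_db.callproc(conn, "SYSPROC.ACCEL_ADD_TABLES",
                             ("IDAA1", table_spec, ""))
    if result:
        print(result[-1])

    ibm_db.close(conn)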

But the Analytics Accelerator isn't just for DB2 data. Users want to learn more about their investment in it and maximize its use in their environment, and they are looking at creative ways to exploit it for IMS, VSAM and SMF data, as well as for non-z/OS data.

There are three different areas where tools can provide value. They are:

  1. Assessment: Do I have a workload that would benefit from Analytics Accelerator?
  2. Optimization: Can I optimize the workload to take advantage of Analytics Accelerator?
  3. Administration: Can I manage Analytics Accelerator more effectively?

How does data get loaded into the Analytics Accelerator? The standard DB2 UNLOAD utility extracts the data, placing tables in read-only mode until the Analytics Accelerator load is done (when using the option for transactionally consistent data). The Analytics Accelerator then reflects the DB2 data as of the point in time the load was initiated. Optionally, you can use Change Data Capture for near-real-time replication to the Analytics Accelerator.
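
The load itself is also driven through a stored procedure. As a rough sketch, again in Python with ibm_db, where the accelerator name, lock mode and table list are placeholders and the XML is simplified (the real table-load specification schema is in the stored-procedure reference):

    import ibm_db

    conn = ibm_db.connect("DSNV110P", "dbuser", "secret")  # placeholder alias

    # Simplified placeholder for the table-load specification document
    load_spec = (
        '<?xml version="1.0" encoding="UTF-8"?>'
        '<tableSetForLoad>'
        '<table schema="SALES" name="ORDERS"/>'
        '</tableSetForLoad>')

    # A lock mode such as TABLESET asks for a transactionally consistent
    # snapshot, which is why the DB2 tables stay read-only during the load.
    result = ibm_db.callproc(conn, "SYSPROC.ACCEL_LOAD_TABLES",
                             ("IDAA1", "TABLESET", load_spec, ""))
    if result:
        print(result[-1])   # message/diagnostics document returned by DB2

    ibm_db.close(conn)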

The new IBM DB2 Analytics Accelerator Loader V1.1 provides group-consistent load and external 'dual' load. It's built for performance and usability. The Loader populates data in the Analytics Accelerator; it can load DB2 and non-DB2 data, it loads in parallel to avoid application downtime, and it can load to a historical point in time.

The Analytics Accelerator Loader can load data from a file in one of two ways: dual external load or Analytics Accelerator-only. With dual external load, it loads the data into both DB2 and the Analytics Accelerator in parallel. With Analytics Accelerator-only, it loads directly into the Analytics Accelerator (there's no load into DB2).

The user is responsible for building the load file. The extracted data can come from various sources, e.g., IMS, VSAM, Oracle, etc., but the file must be valid input for the DB2 LOAD utility, and a field specification compatible with that utility must describe the input data format. Nullable columns, or columns defined as 'not null with default', can be missing from the input data; the Analytics Accelerator Loader will supply the defaults.
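
As an illustration only (the Loader itself runs as a z/OS batch utility), this Python sketch shows the general idea of extracting rows from a non-DB2 source and writing a delimited file that matches a field specification. The source database, query, columns and file name are all placeholders:

    import csv
    import sqlite3  # stand-in for any non-DB2 source (IMS, VSAM, Oracle, ...)

    src = sqlite3.connect("legacy_extract.db")
    rows = src.execute(
        "SELECT order_id, region, amount, order_date FROM orders")

    with open("orders.del", "w", newline="") as f:
        writer = csv.writer(f, delimiter=",", quotechar='"')
        for order_id, region, amount, order_date in rows:
            # Nullable or 'not null with default' columns may be omitted;
            # the Loader supplies the defaults.
            writer.writerow([order_id, region, amount, order_date])

    src.close()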

Note: The information in this blog is taken mainly from a presentation recently given by Rocket Software’s Wayne Morton to the Virtual IMS user group.



Trevor Eddolls is CEO at iTech-Ed Ltd, an IT consultancy. A popular speaker and blogger, he currently chairs the Virtual IMS and Virtual CICS user groups. He’s editorial director for the Arcati Mainframe Yearbook, and for many years edited Xephon’s Update publications.

Posted: 4/14/2015 12:30:45 AM by Trevor Eddolls | with 3 comments

Comments
Thanks for the article on IBM DB2 Analytics Accelerator.
How feasible is it to store duplicate data in both DB2 and the Analytics Accelerator? In today's financial climate, wouldn't it be too expensive to invest in storage capacity when companies are looking to optimize storage for primary business data?
Posted: 4/20/2015 8:42:23 AM by VB_Says
You can load non-DB2 data into IDAA via the Loader or with native Netezza nzload. The input format for loads is delimited, which means that the data could originate from any non-DB2 source.

As far as access through DB2 goes, it depends on how the data was loaded. If the data was loaded into both DB2 and IDAA, you basically access it as if it were a regular DB2 table and let DB2 decide where to run.

On the other hand, if the data was loaded only into IDAA (which makes more sense), you can still get to it via DB2; you just need to set the special register with SET CURRENT QUERY ACCELERATION ALL, which ensures that the queries will be executed on IDAA.

One note: at this time, DB2 does not support joining DB2 and IDAA tables together, so if you are running joins, you need to make sure all of the tables are in IDAA.
[Thanks to SQData's Scott Quillicy for the details here.]
Posted: 4/17/2015 9:01:36 AM by Trevor Eddolls
Interesting article. I was not aware you could load non-DB2 data. So how do you access this non-DB2 data with a DB2 IDAA system?
Posted: 4/16/2015 11:02:16 PM by Troy Coleman