
Monday, October 29, 2012

Team Work..!


Friday, September 28, 2012


Bill Inmon vs. Ralph Kimball
 
In the data warehousing field, we often hear discussions about whether a person's or organization's philosophy falls into Bill Inmon's camp or into Ralph Kimball's camp. We describe the difference between the two below.

Bill Inmon's paradigm: the data warehouse is one part of the overall business intelligence system. An enterprise has one data warehouse, and data marts source their information from the data warehouse. In the data warehouse, information is stored in 3rd normal form.

Ralph Kimball's paradigm: the data warehouse is the conglomerate of all data marts within the enterprise. Information is always stored in the dimensional model.

There is no right or wrong between these two ideas, as they represent different data warehousing philosophies. In reality, the data warehouses in most enterprises are closer to Ralph Kimball's idea. This is because most data warehouses started out as a departmental effort, and hence originated as data marts. Only when more data marts are built later do they evolve into a data warehouse.

Source:http://www.1keydata.com/datawarehousing/inmon-kimball.html



Ethical hacker?

An ethical hacker is a computer and network expert who attacks a security system on behalf of its owners, seeking vulnerabilities that a malicious hacker could exploit. To test a security system, ethical hackers use the same methods as their less principled counterparts, but report problems instead of taking advantage of them. Ethical hacking is also known as penetration testing, intrusion testing and red teaming. An ethical hacker is sometimes called a white hat, a term that comes from old Western movies, where the "good guy" wore a white hat and the "bad guy" wore a black hat. 

One of the first examples of ethical hackers at work was in the 1970s, when the United States government used groups of experts called red teams to hack its own computer systems. According to Ed Skoudis, Vice President of Security Strategy for Predictive Systems' Global Integrity consulting practice, ethical hacking has continued to grow in an otherwise lackluster IT industry, and is becoming increasingly common outside the government and technology sectors where it began. Many large companies, such as IBM, maintain employee teams of ethical hackers.
In a similar but distinct category, a hacktivist is more of a vigilante: detecting, sometimes reporting (and sometimes exploiting) security vulnerabilities as a form of social activism.

 

Data Vault Modeling

 Data Vault Modeling is a database modeling method that is designed to provide historical storage of data coming in from multiple operational systems. It is also a method of looking at historical data that, apart from the modeling aspect, deals with issues such as auditing, tracing of data, loading speed and resilience to change.

Data Vault Modeling focuses on several things. First, it emphasizes the need to trace where all the data in the database came from. Second, it makes no distinction between good and bad data ("bad" meaning not conforming to business rules),[1] leading to "a single version of the facts" versus "a single version of the truth",[2] also expressed by Dan Linstedt as "all the data, all of the time". Third, the modeling method is designed to be resilient to change in the business environment where the data being stored is coming from, by explicitly separating structural information from descriptive attributes.[3] Finally, Data Vault is designed to enable parallel loading as much as possible,[4] so that you can scale out for very large implementations.
An alternative (and seldom used) name for the method is "Common Foundational Integration Modelling Architecture."[5]


Basic notions

Data Vault attempts to solve the problem of dealing with change in the environment by separating the business keys (that do not mutate as often, because they uniquely identify a business entity) and the associations between those business keys, from the descriptive attributes of those keys.
The business keys and their associations are structural attributes, forming the skeleton of the data model. The Data Vault method has as one of its main axioms that real business keys only change when the business changes and are therefore the most stable elements from which to derive the structure of a historical database. If you use these keys as the backbone of a Data Warehouse, you can organize the rest of the data around them. This means that choosing the correct keys for the Hubs is of prime importance for the stability of your model.[13] The keys are stored in tables with a few constraints on the structure. These key-tables are called Hubs.

Hubs

Hubs contain a list of unique business keys with low propensity to change. Hubs also contain a surrogate key for each Hub item and metadata describing the origin of the business key. The descriptive attributes for the information on the Hub (such as the description for the key, possibly in multiple languages) are stored in structures called Satellite tables which will be discussed below.
The Hub contains at least the following fields:[14]
  • a surrogate key, used to connect the other structures to this table.
  • a business key, the driver for this hub. The business key can consist of multiple fields.
  • the record source, which can be used to see where the business keys come from and whether the primary loading system has all of the keys that are available in other systems as well.
  • optionally, you can also have metadata fields with information about manual updates (user/time) and the extraction date.
A Hub is not allowed to contain multiple business keys, except when two systems deliver the same business key but with collisions that have different meanings.
Hubs should normally have at least one satellite.[14]

Hub example

This is an example of a Hub table containing cars, unsurprisingly called "Car" (H_CAR). The driving key is the Vehicle Identification Number (VIN).
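As a hedged sketch of such a Hub (column names like H_CAR_SQN and R_SRC are assumptions, and SQLite stands in for a real warehouse database), it could be created from Python like this:

    import sqlite3

    # Minimal sketch of a Hub table for cars (H_CAR), assuming SQLite.
    # Column names are illustrative: a surrogate key, the business key
    # (Vehicle Identification Number), a load timestamp, and the record source.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE H_CAR (
            H_CAR_SQN     INTEGER PRIMARY KEY,   -- surrogate key
            VEHICLE_ID_NR TEXT NOT NULL UNIQUE,  -- business key (VIN)
            LOAD_DTS      TEXT NOT NULL,         -- extraction/load date
            R_SRC         TEXT NOT NULL          -- record source
        )
    """)
    conn.execute(
        "INSERT INTO H_CAR (VEHICLE_ID_NR, LOAD_DTS, R_SRC) VALUES (?, ?, ?)",
        ("1HGCM82633A004352", "2012-09-28", "SALES_SYSTEM"),
    )
    for row in conn.execute("SELECT * FROM H_CAR"):
        print(row)

Note that the Hub stores only the key, its origin and its load metadata; all descriptive attributes would live in Satellite tables hanging off the surrogate key.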

 

Tuesday, September 25, 2012

How Quantum Computers Work

The massive amount of processing power generated by computer manufacturers has not yet been able to quench our thirst for speed and computing capacity. In 1947, American computer engineer Howard Aiken said that just six electronic digital computers would satisfy the computing needs of the United States. Others have made similar errant predictions about the amount of computing power that would support our growing technological needs. Of course, Aiken didn't count on the large amounts of data generated by scientific research, the proliferation of personal computers or the emergence of the Internet, which have only fueled our need for more, more and more computing power.

Will we ever have the amount of computing power we need or want? If, as Moore's Law states, the number of transistors on a microprocessor continues to double every 18 months, the year 2020 or 2030 will find the circuits on a microprocessor measured on an atomic scale. And the logical next step will be to create quantum computers, which will harness the power of atoms and molecules to perform memory and processing tasks. Quantum computers have the potential to perform certain calculations significantly faster than any silicon-based computer.
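As a rough illustration of that extrapolation (all numbers below are assumptions for the sketch, not figures from the article), a few lines of Python show how quickly feature sizes approach atomic dimensions:

    # Back-of-the-envelope Moore's Law extrapolation (illustrative numbers):
    # assume a 2012 process with 22 nm features, and that each 18-month
    # transistor doubling shrinks the feature size by a factor of sqrt(2).
    feature_nm = 22.0           # assumed 2012 feature size
    year = 2012.0
    silicon_atom_nm = 0.2       # roughly the spacing between silicon atoms
    while feature_nm > silicon_atom_nm:
        year += 1.5             # one doubling period
        feature_nm /= 2 ** 0.5  # area halves, so linear size drops by sqrt(2)
    print(f"Features reach atomic scale around {int(year)} "
          f"({feature_nm:.2f} nm)")

Under these assumed numbers the loop lands in the early 2030s, consistent with the article's "2020 or 2030" ballpark.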
Scientists have already built basic quantum computers that can perform certain calculations, but a practical quantum computer is still years away. In this article, you'll learn what a quantum computer is and just what it'll be used for in the next era of computing.
You don't have to go back too far to find the origins of quantum computing. While computers have been around for the majority of the 20th century, quantum computing was first theorized less than 30 years ago, by a physicist at the Argonne National Laboratory. Paul Benioff is credited with first applying quantum theory to computers in 1981, when he theorized about creating a quantum Turing machine. Most digital computers, like the one you are using to read this article, are based on Turing's theory of computation.

source:http://www.howstuffworks.com/quantum-computer.htm

How users see the programmers :)



Wednesday, August 22, 2012


Data Profiling

Data profiling, also called data archeology, is the statistical analysis and assessment of the quality of data values within a data set for consistency, uniqueness and logic.  

The insight gained by data profiling can be used to determine how difficult it will be to use existing data for other purposes. It can also be used to provide metrics to assess data quality and determine whether or not metadata accurately describes the actual values in the source data. The data profiling process cannot identify inaccurate data; it can only identify business rule violations and anomalies.
Profiling tools evaluate the actual content, structure and quality of the data by exploring relationships that exist between value collections both within and across data sets. For example, by examining the frequency distribution of different values for each column in a table, an analyst can gain insight into the type and use of each column. Cross-column analysis can be used to expose embedded value dependencies and inter-table analysis allows the analyst to discover overlapping value sets that represent foreign key relationships between entities.  
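As a hedged illustration of the column-level analysis described above (the dataset and column names are invented), a few lines of Python can compute per-column frequency distributions:

    from collections import Counter

    # Toy data set for column profiling; rows and column names are invented.
    rows = [
        {"country": "US", "status": "active"},
        {"country": "US", "status": "active"},
        {"country": "DE", "status": "closed"},
        {"country": None, "status": "active"},
    ]
    for column in ("country", "status"):
        values = [r[column] for r in rows]
        freq = Counter(values)          # frequency distribution of the column
        nulls = values.count(None)      # completeness check
        print(f"{column}: {len(freq)} distinct, {nulls} nulls, "
              f"top={freq.most_common(2)}")

Even this tiny profile reveals the kind of insight the article mentions: cardinality, null rates, and the dominant values of each column.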

Source:http://searchdatamanagement.techtarget.com/definition/data-profiling

Tuesday, July 31, 2012


Metrics that Matter - Key Performance Indicators (KPIs)

  • The BI Metrics Framework is a reference document put together by the MAIS BI Team to help facilitate a metrics discussion in an organization.
The biggest challenge for most units is not the technology or the data but determining which metrics matter (what are the appropriate KPIs?). Taking key ratios from various reports that have been used in an organization over time and displaying them graphically is almost certainly not going to result in the correct scorecard for your organization. To identify the correct KPIs, strategic business discussions need to take place with leadership from across the organization. This will involve assessing the organization's high-level strategy, short- and long-term goals, and business drivers impacting the execution of the strategy. See Strategy Mapping [1]
  • "Basic Metric Framework Definitions"
    • Metric is a measure of something.[2]
    • Performance Metric is a measure of some activity related to a company's business performance.
    • Key Performance Indicator (KPI) is a measure of something that is strategically important to the business. [3]
    • Balanced Scorecard [4]. Examples of scorecards used at UM: UM Scorecards
    • Dashboards [5]. Examples of dashboards used at UM: UM Dashboards


You can’t improve what you can’t (or don’t) measure

You could measure a large number of things in your Unit. You could, in fact, spend more time measuring things than doing the work!
  • Focus on the major processes your Unit can control or influence that link to your unit's strategy, priority or values statements (see Strategy Mapping).
  • Look for industry/University best-practice metrics that may apply.
  • Ask yourself questions: How can we do this quicker? How can we save money? How can we reduce errors?
  • The Goal Question Metric (GQM) approach
  • Include others in developing your metrics. Ask yourself and others ... what decisions will you be able to make with this data? How does it support the business?

Select Appropriate Metrics

After you have identified potential metrics you then need to select the metrics that you will begin to generate and report on.
  • Evaluate metrics and select those for further action by using a Prioritization Matrix.
  • Focus on the major processes your team/unit can control or influence.
  • Don’t take on more metrics than your team can handle. Remember you can always come back to them.
For each possible metric, answer the following questions about the data:
  • Is it available already?
  • How frequently is it available?
  • Is it reliable and accurate?
  • How/who collects it?

Develop a Metric Prioritization Matrix

Metric Prioritization Matrix is used to achieve consensus about a potential metric. The Matrix helps you rank metrics by a particular criterion that is important to your team/unit. Then you can more clearly see which metrics are the most important to work on solving first.
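A minimal sketch of such a matrix in Python (the criteria, weights, and 1-5 scores below are invented for illustration): each candidate metric is scored per criterion, the scores are weighted, and the metrics are ranked by total.

    # Illustrative prioritization matrix: weights and 1-5 scores are invented.
    # "effort" is scored so that 5 = least effort to produce the metric.
    criteria = {"strategic_fit": 0.5, "data_available": 0.3, "effort": 0.2}
    scores = {
        "On-time delivery rate": {"strategic_fit": 5, "data_available": 4, "effort": 3},
        "Cost per transaction":  {"strategic_fit": 4, "data_available": 5, "effort": 4},
        "Error rate":            {"strategic_fit": 3, "data_available": 2, "effort": 5},
    }
    ranked = sorted(
        ((sum(w * scores[m][c] for c, w in criteria.items()), m) for m in scores),
        reverse=True,
    )
    for total, metric in ranked:
        print(f"{metric}: {total:.1f}")

The output makes the consensus discussion concrete: the team argues about weights and scores rather than about vague preferences.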

Generate Metrics

Once you have identified and selected which metrics you will focus on, you now need to put together a plan for creating and generating your metrics.
  • Who is responsible for what?
  • What’s the objective of the metric?
  • How do we calculate the metric?
  • Where is the data coming from?
  • What type of reports and dashboards will we generate?
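One way to make these questions concrete is a metric definition template. The sketch below (field names and the example metric are assumptions, not from the source) records the answers in a small Python dataclass:

    from dataclasses import dataclass

    # Illustrative metric definition template; field names are assumptions.
    @dataclass
    class MetricDefinition:
        name: str
        owner: str          # who is responsible
        objective: str      # why the metric exists
        calculation: str    # how it is computed
        data_source: str    # where the data comes from
        report: str         # how it will be displayed

    backlog_age = MetricDefinition(
        name="Average ticket age",
        owner="Service desk lead",
        objective="Spot growing backlogs before they hurt response times",
        calculation="mean(now - opened_at) over open tickets",
        data_source="Ticketing system export",
        report="Weekly dashboard widget",
    )
    print(backlog_age)

Filling in one of these per selected metric doubles as the documentation recommended in the lessons learned below.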

Reporting & Monitoring KPIs

After you have collected all of the required information, you are ready to start building your reports, dashboards and scorecards. Your team needs to agree on how to display your metrics in a simple, combined way.

Lessons Learned and Ideas to Facilitate the Development of KPIs

1. Be prepared for the metrics mountain: there will always be an overwhelming volume of potential metrics, but your job is to identify the few that are important to the business processes that impact unit strategy. (Develop a Metrics Definition Template)
2. Rapid prototyping of reports/dashboards/scorecards, reviewed regularly with leadership, saves time and money and preserves credibility
3. Document all data sources, queries, and methodologies used to generate reports and dashboards
4. Establish a metrics review committee
5. Agree on definitions, calculations and data sources for all metrics used (whether they are performance metrics, or KPIs)
6. Regularly review existing metrics to ensure they are still worth measuring and reporting on

source:http://webservices.itcs.umich.edu/mediawiki

Thursday, July 19, 2012


Data Warehouse Architectures

Data warehouses and their architectures vary depending upon the specifics of an organization's situation. Three common architectures are:
  • Data Warehouse Architecture (Basic)
  • Data Warehouse Architecture (with a Staging Area)
  • Data Warehouse Architecture (with a Staging Area and Data Marts)

Data Warehouse Architecture (Basic)

Figure 1-2 shows a simple architecture for a data warehouse. End users directly access data derived from several source systems through the data warehouse.

Figure 1-2 Architecture of a Data Warehouse



In Figure 1-2, the metadata and raw data of a traditional OLTP system is present, as is an additional type of data, summary data. Summaries are very valuable in data warehouses because they pre-compute long operations in advance. For example, a typical data warehouse query is to retrieve something like August sales. A summary in Oracle is called a materialized view.
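To make the summary idea concrete: SQLite has no materialized views, so as a hedged sketch (table and column names are invented) the same effect can be emulated by precomputing a summary table that a query like "August sales" then reads directly:

    import sqlite3

    # Emulate a materialized view by precomputing a monthly sales summary.
    # Table and column names are invented for illustration.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (sale_date TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO sales VALUES (?, ?)",
        [("2012-08-01", 100.0), ("2012-08-15", 250.0), ("2012-09-02", 80.0)],
    )
    # The "summary": aggregated once, then queried cheaply many times.
    conn.execute("""
        CREATE TABLE sales_by_month AS
        SELECT substr(sale_date, 1, 7) AS month, SUM(amount) AS total
        FROM sales GROUP BY month
    """)
    print(conn.execute(
        "SELECT total FROM sales_by_month WHERE month = '2012-08'"
    ).fetchone())  # -> (350.0,)

In Oracle the materialized view keeps itself current through refresh mechanisms; the sketch only shows why the precomputation makes the August-sales query cheap.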

Data Warehouse Architecture (with a Staging Area)

In Figure 1-2, you need to clean and process your operational data before putting it into the warehouse. You can do this programmatically, although most data warehouses use a staging area instead. A staging area simplifies building summaries and general warehouse management. Figure 1-3 illustrates this typical architecture.

Figure 1-3 Architecture of a Data Warehouse with a Staging Area



Data Warehouse Architecture (with a Staging Area and Data Marts)

Although the architecture in Figure 1-3 is quite common, you may want to customize your warehouse's architecture for different groups within your organization. You can do this by adding data marts, which are systems designed for a particular line of business. Figure 1-4 illustrates an example where purchasing, sales, and inventories are separated. In this example, a financial analyst might want to analyze historical data for purchases and sales.

Figure 1-4 Architecture of a Data Warehouse with a Staging Area and Data Marts



Source: http://docs.oracle.com/

Real-Time Business Intelligence?
In today's competitive environment, with high consumer expectations, decisions based on the most current data available will improve customer relationships, increase revenue, and maximize operational efficiencies. The speed of today's processing systems has moved classical data warehousing into the realm of real-time. The result is real-time business intelligence (RTBI). Business transactions are fed as they occur to a real-time business intelligence system that maintains the current state of the enterprise. The RTBI system not only supports the classical strategic functions of data warehousing, deriving information and knowledge from past enterprise activity, but also provides real-time tactical support to drive enterprise actions that react to immediate events. As such, it replaces both the classical data warehouse and the enterprise application integration (EAI) functions.

Real-time business intelligence is also known as event-driven business intelligence. In order to react in real-time, a business intelligence system must react to events as they occur – not minutes or hours later. With real-time business intelligence, an enterprise establishes long-term strategies to optimize its operations while at the same time reacting with intelligence to events as they occur.
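As a toy illustration of the event-driven idea (the event feed is simulated; no real messaging system or vendor API is implied), a running aggregate can be updated as each transaction arrives rather than recomputed in a nightly batch:

    # Toy event-driven aggregate: state is updated per event, not per batch.
    # The event stream is simulated; a real system would consume a queue.
    state = {"orders": 0, "revenue": 0.0}

    def on_event(event):
        state["orders"] += 1
        state["revenue"] += event["amount"]
        if event["amount"] > 500:          # react to the event immediately
            print(f"ALERT: large order {event['id']} ({event['amount']})")

    for event in [{"id": 1, "amount": 120.0}, {"id": 2, "amount": 900.0}]:
        on_event(event)

    print(state)  # current, up-to-the-moment view of the enterprise

The alert inside the handler is the tactical, react-as-it-happens side; the accumulated state is the strategic view the classical warehouse would otherwise supply hours later.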

source:www.gravic.com

Cloud Business Intelligence
Cloud BI is the new way to do Business Intelligence: instead of implementing expensive and complex software on-site, the BI software runs in the Cloud.
It is accessible via any web browser in a so-called software-as-a-service model. There is no need to install software, nor to buy any hardware. And when your computing needs grow, the system will automatically assign more resources. This elastic scale is what makes Cloud BI so powerful: you pay for what you use, as opposed to always paying to provision for peak load.
With business intelligence software running in the cloud, it is still possible to make comprehensive integration with back-end systems – both within your company and in the cloud. A secure REST-based API is used to enable integrations that work perfectly in a service-oriented architecture. This architecture is ideally suited to integrate data from various sources, and you can mash up internal sales data with public data like economic trends reported by the government.
GoodData provides the most comprehensive Cloud Business Intelligence system on the market today. Built on top of Amazon Web Services, it is almost infinitely scalable. A dedicated user interface team ensures optimal usability. Pre-defined applications accelerate time to market.
source:http://www.gooddata.com

Gartner Magic Quadrant 2012

Positioning Technology Players Within a Specific Market
Who are the competing players in the major technology markets? How are they positioned to help you over the long haul? Gartner Magic Quadrants are a culmination of research in a specific market, giving you a wide-angle view of the relative positions of the market's competitors. By applying a graphical treatment and a uniform set of evaluation criteria, a Gartner Magic Quadrant quickly helps you digest how well technology providers are executing against their stated vision.

How Do You Use Magic Quadrants?

Clients use Magic Quadrants as a first step to understanding the technology providers they might consider for a specific investment opportunity.
Keep in mind that focusing on the leaders' quadrant isn't always the best course of action. There are good reasons to consider market challengers. And a niche player may support your needs better than a market leader. It all depends on how the provider aligns with your business goals.

How Do Magic Quadrants Work?

Magic Quadrants provide a graphical competitive positioning of four types of technology providers, where market growth is high and provider differentiation is distinct:
Leaders execute well against their current vision and are well positioned for tomorrow.
Visionaries understand where the market is going or have a vision for changing market rules, but do not yet execute well.
Niche Players focus successfully on a small segment, or are unfocused and do not out-innovate or outperform others.
Challengers execute well today or may dominate a large segment, but do not demonstrate an understanding of market direction.

source:http://www.gartner.com/
BI Trends for 2012
By Jonathan Taylor, Information Developer, Klipfolio.

2012 is shaping up to be another big year for the business intelligence (BI) community. Let's start the year by taking a look ahead to see what trends are going to have an impact on the way we do business.
Here's our prediction for the top 5 trends for business intelligence in 2012:

5. Pervasive Business Intelligence and Data Democracy

In a recent survey, Information Week found that among businesses that had adopted a BI tool, only 25% of employees had access to that tool. Expect this to change over the next few years as organizations begin adopting cloud and mobile BI dashboards. In some respects, traditional BI tools have simply been too bulky and technical for the other 75% of employees to use; it is not that those employees don't need the information (see #3, Self-Serve BI).
The common thread throughout the trends listed here is the idea that business intelligence is heading towards simpler, more straightforward methods and tools. The way organizations use BI tools is changing to give people throughout the organization access to KPIs (key performance indicators), rather than limiting them to end-of-month meetings in a boardroom. And that's a good thing, because every job has some degree of decision making associated with it. Consider the warehouse worker wasting time looking for an out-of-stock item when they could simply check their KPI dashboard and save time and money.

4. Operational or Tactical Business Intelligence

According to Gartner, operational or tactical business intelligence (BI) is growing at a rate of 13% CAGR. That's an impressive number, especially when one considers that analytical or traditional BI tools are growing at a rate of 9% CAGR. This is the result of tandem developments in BI over the last few years. Increased adoption of agile BI tools like cloud and mobile BI encourage individuals to access their KPI dashboards more often. Daily performance metrics — the ones more likely to fluctuate on an hourly basis — are much more relevant to mobile users because they can use that information in a constructive, actionable way. As mentioned above, pervasive BI is also influencing this shift towards operational BI, since analytical BI is more often used by decision makers like executives.
An operational dashboard works much like a car's dashboard. As you drive, you monitor metrics that indicate the current performance of your vehicle and make adjustments accordingly. When the speed limit changes, you check your speedometer and slow down; when you see you are out of gas, you pull over and fill up. Likewise, an operational dashboard allows you to make tactical decisions based on current performance, whether that is chasing a red-hot lead or ordering an out-of-stock product.

3. Simplicity and Self-Serve Business Intelligence

Self-serve business intelligence (BI) is about simplicity. In turn, simplicity is about accomplishing your goals without being completely taxed from the effort. The reason self-serve BI is so attractive — and why it is projected to increase throughout 2012 — is that it offers end-users the ability to apply their knowledge of what metrics and ratios matter, and share this expertise by building dashboards without IT support. As more organizations demand self-serve and user-friendly BI tools, vendors will need to continue (or start!) to work on putting non-technical users in the driver's seat of their BI dashboards.
One of the most frequently touted promises of providing simpler, more self-serve oriented BI tools is the dramatic reduction of IT involvement. Whether that goal is attainable or not is irrelevant. The important thing is that in attempting to reach that goal, BI tools will evolve to be more user-friendly and will encourage everyone to start monitoring important metrics and KPIs (see #5. Pervasive BI and Data Democracy).

2. Cloud Hosted Business Intelligence Solutions

Cloud-deployed business intelligence (BI) tools are becoming more popular with each passing year. As vendors continue to develop better tools, business units forge ahead and capitalize on the simplicity of deploying their BI tools in the cloud. According to a June 2011 survey by the BI Leadership Forum, more than one-third of organizations have a cloud-based BI tool. Even more telling is that 65% of those organizations are planning to increase their use of cloud BI in the next 12 months. These statistics are a promising indicator of what many analysts and vendors have been predicting for the last few years: increased adoption of cloud-based BI solutions.
Throughout 2012, expect the adoption of cloud BI tools to be driven by a number of important factors. First, cloud-based solutions offer the advantage of being relatively simple and convenient to deploy. Second, cloud tools are more easily scalable, providing access to key performance indicators (KPIs) for everyone in your organization, no matter where they are or what device they are using. This in turn fuels pervasive BI and improves decision making across the organization. Lastly, continually improving security measures will put to rest any reservations businesses have about storing their sensitive data in the cloud.

1. Mobile Business Intelligence and Accessibility

Mobile business intelligence (BI) tools have long been touted as the future of BI, and for good reason. In a revealing survey conducted by Gartner, it was found that by 2013 one-third of all BI usage will be on a mobile device, such as a smartphone or tablet. This is a remarkable figure that points to another year in which mobile BI adoption continues to chip away at desktop-only solutions. In part, the changing dynamic of many workplaces is fueling adoption of mobile BI (see this Case Study to see just what we are talking about). And this was reflected in a recent survey conducted by the Business Application Research Center, which found that while adoption is currently at about 8%, over the next 12 months 30% of respondents planned to deploy a mobile BI tool. Gartner reports mobile BI to be growing at a staggering 40% CAGR.
This trend is easy to tie in with the 4 preceding predictions in this article. Mobile BI is increasingly being seen as a tool for providing on-the-go workers with access to KPIs — imagine a warehouse worker knowing the exact amount of stock, or a sales rep able to monitor territorial performance while away from the office at a tradeshow. Operational BI and mobile tools synergize to create an environment where performance can be managed effectively and from any location. By necessity, organizations deploying and vendors developing mobile BI tools need to consider the implications of working in a remote environment. In other words, a traditional BI dashboard laden with juicy KPIs just doesn't translate from the desktop to a mobile phone in a way that is user friendly.
Source: Klipfolio.com

BUSINESS INTELLIGENCE (BI) TOOLS

A list of Business Intelligence (BI) tools – see the table below. They are widely used for reporting, dashboarding and analysis. The following BI tools, in alphabetical order, were thoroughly examined on 103 criteria and are part of our 100% vendor independent Business Intelligence tool comparison, the BI Tool Survey 2012.
Business Intelligence Tool                     Version    Vendor
BizzScore Suite                                7.3        EFM Software
Board Management Intelligence Toolkit          7.1        Board International
Business Objects Enterprise                    XI r4      SAP
IBM Cognos Series 10                           10.1       IBM
JasperSoft (open source)                       4.5        JasperSoft
Microsoft BI tools (integrated BI offering*)   2008/2010  Microsoft
MicroStrategy                                  9          MicroStrategy
Oracle Enterprise BI Server                    11g1       Oracle
Oracle Hyperion System                         9          Oracle
Pentaho BI suite (open source)                 -          Pentaho
QlikView                                       11         QlikTech
SAP NetWeaver BI                               7.3        SAP
SAS Enterprise BI Server                       9.2        SAS Institute
Style Intelligence                             11         InetSoft
Tableau Software                               6.1        Tableau Software
WebFocus                                       8          Information Builders
Source: http://www.businessintelligencetoolbox.com
What is BI or Business Intelligence ?
The term Business Intelligence (BI) represents the tools and systems that play a key role in the strategic planning process of the corporation. These systems allow a company to gather, store, access and analyze corporate data to aid in decision-making. Generally these systems will illustrate business intelligence in the areas of customer profiling, customer support, market research, market segmentation, product profitability, statistical analysis, and inventory and distribution analysis to name a few.

Most companies collect a large amount of data from their business operations. To keep track of that information, a business would need to use a wide range of software programs, such as Excel, Access and different database applications, for the various departments throughout the organization. Using multiple software programs makes it difficult to retrieve information in a timely manner and to perform analysis of the data. (Source: Webopedia)