
OLAP: A Deep Dive into Online Analytical Processing

January 9, 2024

OLAP (Online Analytical Processing) has become a key concept in data analysis and reporting within the ever-changing business intelligence landscape. As companies look for advanced data-driven decision-making tools, many turn to OLAP to get more value out of their data. This article provides a detailed exploration of OLAP and its role in empowering senior leadership, including chief people officers, managing directors, and country managers, with actionable insights.

Key Characteristics of OLAP

Online Analytical Processing (OLAP) is a widely used technology for interactive, multidimensional data analysis. Unlike Online Transactional Processing (OLTP), which handles day-to-day transactions, OLAP is designed for complex queries and reporting. In OLAP systems, data is organized into multidimensional structures that support efficient, dynamic analysis.

  • Multidimensionality: OLAP systems organize information into dimensions and hierarchies, creating a multidimensional view suitable for many kinds of analysis. Users can drill down into or slice through the data at different levels to gain deeper insights.
  • Aggregation: Aggregation lets users roll up to summaries or drill down into details at different levels of granularity. This flexibility matters because executives need both a comprehensive overview and fine-grained detail.
  • Interactivity: Business executives can manipulate the organization's primary data in real time when making decisions. This is especially useful when managers must evaluate multiple scenarios quickly before committing to a final choice.
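The roll-up and drill-down operations above can be sketched in plain Python: the same fact records are aggregated under coarser or finer grouping keys. The data and function names here are illustrative, not drawn from any OLAP product.

```python
from collections import defaultdict

# Toy fact records: (year, month, region, sales). Data is illustrative.
facts = [
    (2023, 1, "EMEA", 120), (2023, 1, "APAC", 80),
    (2023, 2, "EMEA", 100), (2024, 1, "EMEA", 150),
]

def roll_up(records, key_fn):
    """Aggregate sales at the granularity chosen by key_fn."""
    totals = defaultdict(int)
    for year, month, region, sales in records:
        totals[key_fn(year, month, region)] += sales
    return dict(totals)

# Rolled up to year level (the coarse, executive overview)...
by_year = roll_up(facts, lambda y, m, r: y)
# ...then drilled down to (year, month) for a more detailed view.
by_year_month = roll_up(facts, lambda y, m, r: (y, m))
```

Real OLAP engines precompute and index such aggregates rather than rescanning the facts, but the granularity-switching idea is the same.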

OLAP Models

Online Analytical Processing (OLAP) models form the backbone of interactive and multidimensional data analysis. In this section, we delve into the various OLAP models, each offering unique characteristics to cater to the diverse needs of businesses.

MOLAP (Multidimensional OLAP)

The MOLAP model stores data in multidimensional cubes, which enables structured and efficient analysis. This approach offers fast query performance and is therefore ideal for cases where response time is critical.

Key Features

  • Cube Structure: Data is stored in a cube format, facilitating easy navigation.
  • High Performance: MOLAP systems are optimized for fast query retrieval.
  • Examples: Microsoft Analysis Services, IBM Cognos TM1.

ROLAP (Relational OLAP)

ROLAP systems store data in relational databases, which makes them more scalable and flexible. This model works especially well with large datasets that have complex associations.

  • Relational Storage: Data is stored in relational databases, ensuring flexibility.
  • Scalability: ROLAP systems can handle vast amounts of data effectively.
  • Examples: Oracle OLAP, SAP BW.

HOLAP (Hybrid OLAP)

HOLAP balances the performance and scalability trade-offs of MOLAP and ROLAP by combining multidimensional cube storage with relational databases in a single hybrid approach.

  • Hybrid Approach: HOLAP systems leverage both cube and relational storage methods.
  • Optimal Performance: Balances performance considerations for diverse analytical needs.
  • Examples: Microsoft SQL Server Analysis Services.

Understanding the nuances of each OLAP model is crucial for businesses seeking to align their data analysis capabilities with specific requirements and objectives. Whether prioritizing speed, scalability, or a hybrid approach, selecting the right OLAP model is integral to unlocking the full potential of multidimensional data analysis.

OLAP in Data Warehouse Architecture

In a fast-moving BI landscape, a strong OLAP data warehouse architecture is crucial to sound decision-making. At the center of this architecture is OLAP (Online Analytical Processing), a powerful tool that converts raw data into actionable insights.

The Data Warehouse Foundation

As of 2021, the global online analytical processing market was valued at approximately $3.8 billion, with a compound annual growth rate (CAGR) of around 8%. Before getting into OLAP, it’s important to understand what makes up data warehousing. A data warehouse pools organizational information from different sources into a complete, structured dataset, providing an essential platform for analysis. The features that typically define a data warehouse include:

  • Centralized Storage: Data warehouses provide a single, centralized location for storing data. This eliminates data silos, ensuring that all relevant information is accessible from a unified source. This centralized storage is crucial for streamlined analysis for businesses with diverse datasets.
  • Historical Data: Unlike traditional databases focusing on current data, data warehouses store historical data over time. This historical perspective allows businesses to analyze trends, track performance, and make informed decisions based on a comprehensive understanding of their data.

Enhancing Analytical Capabilities

A TDWI survey indicated that over 60% of surveyed companies have implemented OLAP in their data warehousing strategy. Once the foundation of your data warehouse is established, OLAP technologies allow you to realize its full potential. Online Analytical Processing serves as an analytic engine, enabling interactive, dynamic analysis of multidimensional arrays or cubes stored in compatible database management systems.

  • Cube Creation: OLAP organizes information into dimensional structures called cubes: full representations of the data across multiple dimensions and hierarchies. Cube building involves identifying the dimensions relevant to the data, which makes nuanced analysis possible.
  • Integration with ETL Processes: To populate and refresh a data warehouse, organizations use Extract, Transform, Load (ETL) processes. OLAP is closely tied to these ETL operations so that ever-changing warehouse data is always ready for analysis. This integration establishes a dynamic relationship between OLAP and the data warehouse, allowing real-time insights.

OLAP Models in Data Warehouse Architecture

Studies by Forrester Research highlight that organizations leveraging OLAP in their data warehousing architecture experience, on average, a 15% improvement in decision-making processes and a 20% reduction in time spent on data analysis. OLAP comes in various models, each with its strengths and use cases. Understanding these models is crucial for optimizing analytical processes within the data warehouse.

  • MOLAP (Multidimensional OLAP): MOLAP systems store data in a multidimensional cube format. This storage structure is highly efficient for quick query performance, making it ideal for scenarios where rapid analysis is paramount.
  • ROLAP (Relational OLAP): ROLAP systems store data in relational databases. This model offers greater scalability and flexibility, making it suitable for large-scale data warehousing scenarios.
  • HOLAP (Hybrid OLAP): HOLAP combines elements of both MOLAP and ROLAP, offering a balanced approach that prioritizes performance and scalability. This model benefits organizations seeking a middle ground between speed and adaptability.

OLAP Analysis Techniques

Among businesses implementing OLAP, the distribution among MOLAP, ROLAP, and HOLAP models is approximately 40%, 35%, and 25% respectively. OLAP’s true strength lies in enabling users to analyze multidimensional data interactively. Several analysis techniques empower users to derive meaningful insights:

  • Slice and Dice: OLAP allows users to “slice” the data by selecting a specific dimension and “dice” it by choosing subsets. This technique provides a granular data view, allowing detailed analysis based on specific parameters.
  • Pivot: The pivot function enables users to rotate the axes of the cube, providing different perspectives of the data. This dynamic feature is valuable for decision-makers who need diverse angles for strategic decision-making.
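Slice and dice are easiest to see on a tiny in-memory cube. The following stdlib-Python sketch (hypothetical data and helper names) models a cube as a dictionary keyed by dimension tuples: slicing fixes one dimension to get a lower-dimensional view, and dicing keeps chosen subsets of several dimensions.

```python
# A tiny cube as a dict keyed by (region, product, quarter) -> sales.
cube = {
    ("EMEA", "laptop", "Q1"): 10, ("EMEA", "phone", "Q1"): 7,
    ("APAC", "laptop", "Q1"): 4,  ("APAC", "laptop", "Q2"): 6,
    ("EMEA", "laptop", "Q2"): 12,
}

def slice_cube(cube, axis, value):
    """Fix one dimension ('slice'), returning a lower-dimensional view."""
    idx = {"region": 0, "product": 1, "quarter": 2}[axis]
    return {tuple(v for i, v in enumerate(k) if i != idx): s
            for k, s in cube.items() if k[idx] == value}

def dice_cube(cube, regions, quarters):
    """Keep chosen subsets of two dimensions ('dice')."""
    return {k: s for k, s in cube.items()
            if k[0] in regions and k[2] in quarters}

q1 = slice_cube(cube, "quarter", "Q1")            # 2-D view of Q1 only
emea_q1q2 = dice_cube(cube, {"EMEA"}, {"Q1", "Q2"})
```

A pivot, in these terms, is just presenting the same keyed values with a different choice of row and column dimensions.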

OLAP Reporting

About 70% of large enterprises are integrating OLAP capabilities with big data analytics solutions to handle increasing data volumes. OLAP goes beyond analysis by facilitating the creation of comprehensive OLAP reports. Reporting features are crucial for delivering actionable insights to stakeholders across the organization.

  • Customized Dashboards: OLAP tools enable the creation of customized dashboards that present key performance indicators (KPIs) in a visually appealing and easily understandable format. These dashboards provide a consolidated view of critical metrics, supporting faster decision-making.
  • Ad-hoc Reporting: The flexibility of online analytical processing allows users to generate ad-hoc reports on the fly. This capability is invaluable for scenarios where immediate insights are required, empowering decision-makers with the information they need without delay.

OLAP Data Modeling

The migration to cloud-based data warehousing solutions is on the rise, with over 50% of enterprises planning or already moving their data warehousing infrastructure to the cloud. Data modeling is crucial to leveraging OLAP effectively within the data warehouse architecture. Dimensional modeling, in particular, is pivotal in optimizing OLAP analysis.

  • Star Schema: One prevalent dimensional modeling technique is the star schema. In this model, a central fact table is surrounded by dimension tables. This schema simplifies querying and enhances performance by creating a structure that facilitates efficient data retrieval.
  • Snowflake Schema: A snowflake schema, where dimension tables are normalized, is sometimes employed. While this approach ensures data integrity, it may require more complex queries than the star schema.
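A minimal star schema can be demonstrated with SQLite: one fact table surrounded by a dimension table, queried with a single join and a GROUP BY. The table and column names below are invented for illustration.

```python
import sqlite3

# Minimal star schema: one fact table referencing one dimension table.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY,
                              name TEXT, category TEXT);
    CREATE TABLE fact_sales  (product_id INTEGER REFERENCES dim_product,
                              quantity INTEGER, revenue REAL);
""")
con.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Laptop", "Hardware"), (2, "Office Suite", "Software")])
con.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                [(1, 3, 3000.0), (2, 5, 500.0), (1, 1, 1000.0)])

# A typical OLAP-style query: aggregate the fact table, grouped by a
# dimension attribute, reachable through a single join.
rows = con.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product p USING (product_id)
    GROUP BY p.category ORDER BY p.category
""").fetchall()
```

In a snowflake schema, `category` would live in its own normalized table, so the same question would require one more join.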

OLAP and Big Data

A case study conducted by IBM reported that organizations implementing OLAP solutions experienced, on average, a 20% improvement in return on investment (ROI) within the first year. As organizations grapple with the influx of big data, OLAP systems must adapt to handle vast datasets. The intersection of OLAP and big data opens new possibilities for scalable and high-performance analytics.

  • Scalability: Scalable OLAP solutions accommodate the growing data demands faced by enterprises dealing with massive datasets. This scalability ensures that OLAP remains a reliable tool for businesses seeking to harness the potential of big data.
  • Integration with Advanced Analytics: The integration of OLAP with advanced analytics tools enhances its capability to derive actionable insights from vast datasets. This synergy positions OLAP as a strategic asset for businesses looking to stay ahead in a competitive landscape.

At its core, OLAP data modeling is about organizing data to facilitate intuitive and efficient analysis. Unlike traditional relational databases optimized for transactional processing (OLTP), OLAP data modeling focuses on providing a multidimensional view of the data, allowing users to navigate through various dimensions for a comprehensive understanding.

  • Centralized Metrics: Facts in OLAP data models represent the numerical data or metrics that businesses want to analyze. These could include sales figures, revenue, quantities sold, or any other measurable KPIs central to the organization.
  • Organized Structures: Hierarchies define the relationships within dimensions. For example, a time dimension hierarchy could include levels like year, quarter, month, and day. Hierarchies enhance the ability to drill down or roll up through different levels of granularity.
  • Quantifiable Attributes: Measures are additional quantitative attributes associated with dimensions. They provide further granularity to the analysis. For instance, within the “product” dimension, measures could include a unit price or discount percentage.
  • Centralized Fact Table: The star schema is one of the most widely used OLAP data modeling techniques. In this model, a central fact table is surrounded by dimension tables, forming a star-like structure. This simplifies queries and enhances performance by reducing the number of joins needed.
  • Snowflake Schema: Alternatively, the snowflake schema extends the star schema by normalizing dimension tables. While this maintains data integrity, it can result in more complex queries due to the need for additional joins.
  • Collaborative Approach: Work closely with stakeholders, including higher management, to understand the metrics and dimensions critical for decision-making. This collaboration ensures the OLAP data model is tailored to meet organizational needs.
  • Focus on Relevance: Identify the KPIs that align with organizational goals. This step is crucial for managing directors and executives who require high-level strategic insights.

How can Brickclay Help?

Brickclay, as a leading provider of business intelligence services, is well-positioned to assist organizations in leveraging the power of OLAP for their data analysis and decision-making needs. Here’s how Brickclay can specifically help businesses, addressing the requirements of higher management, chief people officers, managing directors, and country managers:

  • Customized OLAP Solutions: Brickclay specializes in tailoring OLAP solutions to meet the unique needs of businesses. Whether implementing MOLAP, ROLAP, or HOLAP models, the company ensures that the chosen OLAP system aligns seamlessly with the organization’s data structure and analytical requirements.
  • Data Warehousing Expertise: With a deep understanding of data warehousing, Brickclay can assist in designing and implementing robust data warehouse architectures. This includes centralizing data from various sources, establishing efficient ETL processes, and ensuring data integrity – all essential components for effective OLAP analysis.
  • Persona-Centric OLAP Implementation: Brickclay adopts a persona-centric approach to OLAP implementation. The company works closely with higher management, chief people officers, managing directors, and country managers to understand their specific analytical needs. This ensures that the OLAP system is configured to deliver actionable insights tailored to each persona’s requirements.
  • Training and Support: Understanding the importance of user adoption, Brickclay provides comprehensive training sessions for employees at all levels. Whether it’s teaching executives how to perform scenario analyses or guiding HR teams through workforce analytics, the aim is to empower users to make the most of OLAP tools.
  • Dashboard Development: Brickclay excels in creating intuitive and visually appealing dashboards that cater to the diverse needs of higher management, chief people officers, managing directors, and country managers. These dashboards offer quick access to key metrics, supporting informed decision-making.
  • Scalable OLAP Solutions for Growth: Recognizing that businesses evolve, Brickclay ensures that the implemented OLAP solutions are scalable. This scalability is crucial for managing directors overseeing expanding operations and dealing with increasing data volumes.
  • Integration with Advanced Analytics: As the landscape of business intelligence evolves, Brickclay is at the forefront of integrating OLAP with advanced analytics, including AI and machine learning. This ensures businesses can benefit from predictive analytics, gaining insights into future trends.
  • Cloud-Based OLAP Solutions: Embracing the cloud computing trend, Brickclay offers expertise in implementing cloud-based OLAP solutions. This provides increased scalability and cost-efficient alternatives for businesses looking to optimize their infrastructure investments.
  • Continuous Innovation and Future-Proofing: Brickclay is committed to staying ahead of the curve, keeping clients at the forefront of technological advancements in OLAP. The company actively explores emerging trends, such as enhanced natural language processing, to ensure that clients are equipped with the latest tools for data analysis.

Feel free to contact us at Brickclay for unparalleled expertise in business intelligence and personalized solutions tailored to your data needs. Contact our dedicated team today to embark on a journey of insightful decision-making. Your success starts with a conversation – connect with us now.


NixonData.com

What is OLAP and OLTP, use-cases, comparison, and examples

OLTP (Online Transaction Processing) and OLAP (Online Analytical Processing) are two different types of database systems that are designed to support different types of workloads.

OLTP systems are designed to support transactional workloads, which involve inserting, updating, and deleting small amounts of data in a database. These systems are optimized for fast read and write performance and are typically used to support real-time operational systems, such as e-commerce websites, ATMs, and airline reservation systems.

OLAP systems, on the other hand, are designed to support analytical workloads, which involve querying large amounts of data to extract insights and perform data analysis. These systems are optimized for fast query performance and are typically used to support business intelligence and data warehousing applications.

Here are some examples of use cases for OLTP and OLAP systems.

OLTP use cases:

  • E-commerce websites
  • Banking systems
  • Inventory management systems
  • Customer relationship management (CRM) systems

OLAP use cases:

  • Data warehousing and business intelligence
  • Financial analysis
  • Marketing analysis
  • Supply chain analysis

In summary, OLTP systems are used for transactional processing, while OLAP systems are used for analytical processing. OLTP systems are optimized for fast read and write performance, while OLAP systems are optimized for fast query performance.
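The contrast can be made concrete with SQLite: the OLTP side is small keyed writes and point reads, while the OLAP side is a scan-and-aggregate query over the whole table. This is only an illustrative sketch; real OLTP and OLAP systems differ in storage engines and indexing, not just query shape.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, "
            "customer TEXT, amount REAL)")

# OLTP-style workload: many small writes and point reads by key.
con.execute("INSERT INTO orders VALUES (1, 'alice', 25.0)")
con.execute("INSERT INTO orders VALUES (2, 'bob', 40.0)")
con.execute("UPDATE orders SET amount = 30.0 WHERE id = 1")
one_order = con.execute(
    "SELECT amount FROM orders WHERE id = 1").fetchone()[0]

# OLAP-style workload: a scan-and-aggregate query over the whole table.
total, avg = con.execute(
    "SELECT SUM(amount), AVG(amount) FROM orders").fetchone()
```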

Here is a list of some popular OLTP and OLAP tools:

OLTP (Online Transaction Processing) tools:

  • Microsoft SQL Server
  • Oracle Database

OLAP (Online Analytical Processing) tools:

  • Apache Hive
  • Apache Impala
  • Amazon Redshift
  • Google BigQuery

Note that some database systems, such as Oracle, can be used for both OLTP and OLAP workloads, depending on how they are configured and used.




Online analytical processing (OLAP)

Online analytical processing (OLAP) is a technology that organizes large business databases and supports complex analysis. It can be used to perform complex analytical queries without negatively affecting transactional systems.

The databases that a business uses to store all its transactions and records are called online transaction processing (OLTP) databases. These databases usually have records that are entered one at a time. Often they contain a great deal of information that is valuable to the organization. The databases that are used for OLTP, however, were not designed for analysis. Therefore, retrieving answers from these databases is costly in terms of time and effort. OLAP systems were designed to help extract this business intelligence information from the data in a highly performant way. This is because OLAP databases are optimized for heavy read, low write workloads.

Diagram that shows the OLAP logical architecture in Azure.

Semantic modeling

A semantic data model is a conceptual model that describes the meaning of the data elements it contains. Organizations often have their own terms for things, sometimes with synonyms, or even different meanings for the same term. For example, an inventory database might track a piece of equipment with an asset ID and a serial number, but a sales database might refer to the serial number as the asset ID. There is no simple way to relate these values without a model that describes the relationship.

Semantic modeling provides a level of abstraction over the database schema, so that users don't need to know the underlying data structures. This makes it easier for end users to query data without performing aggregates and joins over the underlying schema. Also, usually columns are renamed to more user-friendly names, so that the context and meaning of the data are more obvious.

Semantic modeling is predominately used for read-heavy scenarios, such as analytics and business intelligence (OLAP), as opposed to more write-heavy transactional data processing (OLTP). This is mostly due to the nature of a typical semantic layer:

  • Aggregation behaviors are set so that reporting tools display them properly.
  • Business logic and calculations are defined.
  • Time-oriented calculations are included.
  • Data is often integrated from multiple sources.

Traditionally, the semantic layer is placed over a data warehouse for these reasons.

Example diagram of a semantic layer between a data warehouse and a reporting tool

There are two primary types of semantic models:

  • Tabular. Uses relational modeling constructs (model, tables, columns). Internally, metadata is inherited from OLAP modeling constructs (cubes, dimensions, measures). Code and script use OLAP metadata.
  • Multidimensional. Uses traditional OLAP modeling constructs (cubes, dimensions, measures).

Relevant Azure service:

  • Azure Analysis Services

Example use case

An organization has data stored in a large database. It wants to make this data available to business users and customers to create their own reports and do some analysis. One option is just to give those users direct access to the database. However, there are several drawbacks to doing this, including managing security and controlling access. Also, the design of the database, including the names of tables and columns, may be hard for a user to understand. Users would need to know which tables to query, how those tables should be joined, and other business logic that must be applied to get the correct results. Users would also need to know a query language like SQL even to get started. Typically this leads to multiple users reporting the same metrics but with different results.

Another option is to encapsulate all of the information that users need into a semantic model. The semantic model can be more easily queried by users with a reporting tool of their choice. The data provided by the semantic model is pulled from a data warehouse, ensuring that all users see a single version of the truth. The semantic model also provides friendly table and column names, relationships between tables, descriptions, calculations, and row-level security.
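One way to picture a semantic model is as a mapping from business-friendly names and predefined measures to the physical schema, from which SQL is generated on the user's behalf. The sketch below is hypothetical (all table, column, and measure names are invented) and is not based on any particular product's API.

```python
# A toy semantic layer: friendly names and predefined measures mapped to
# the physical schema, so end users never touch raw table names.
SEMANTIC_MODEL = {
    "columns": {"Sales Amount": "fact_sales.revenue",
                "Product Category": "dim_product.category"},
    "measures": {"Total Sales": "SUM(fact_sales.revenue)"},
}

def build_query(measure, by):
    """Translate a business question into SQL using the model."""
    return (f"SELECT {SEMANTIC_MODEL['columns'][by]}, "
            f"{SEMANTIC_MODEL['measures'][measure]} "
            f"FROM fact_sales JOIN dim_product USING (product_id) "
            f"GROUP BY {SEMANTIC_MODEL['columns'][by]}")

sql = build_query("Total Sales", "Product Category")
```

Because every user asks for "Total Sales" through the same predefined measure, everyone gets the same definition: the "single version of the truth" described above.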

Typical traits of semantic modeling

Semantic modeling and analytical processing tend to have the following traits:

Requirement | Description
Schema | Schema on write, strongly enforced
Uses Transactions | No
Locking Strategy | None
Updateable | No (typically requires recomputing cube)
Appendable | No (typically requires recomputing cube)
Workload | Heavy reads, read-only
Indexing | Multidimensional indexing
Datum size | Small to medium sized
Model | Multidimensional
Data shape | Cube or star/snowflake schema
Query flexibility | Highly flexible
Scale | Large (10s-100s of GBs)

When to use this solution

Consider OLAP in the following scenarios:

  • You need to execute complex analytical and ad hoc queries rapidly, without negatively affecting your OLTP systems.
  • You want to provide business users with a simple way to generate reports from your data.
  • You want to provide a number of aggregations that will allow users to get fast, consistent results.

OLAP is especially useful for applying aggregate calculations over large amounts of data. OLAP systems are optimized for read-heavy scenarios, such as analytics and business intelligence. OLAP allows users to segment multi-dimensional data into slices that can be viewed in two dimensions (such as a pivot table) or filter the data by specific values. This process is sometimes called "slicing and dicing" the data, and can be done regardless of whether the data is partitioned across several data sources. This helps users to find trends, spot patterns, and explore the data without having to know the details of traditional data analysis.

Semantic models can help business users abstract relationship complexities and make it easier to analyze data quickly.

For all the benefits OLAP systems provide, they do produce a few challenges:

  • Whereas data in OLTP systems is constantly updated through transactions flowing in from various sources, OLAP data stores are typically refreshed at much slower intervals, depending on business needs. This means OLAP systems are better suited for strategic business decisions, rather than immediate responses to changes. Also, some level of data cleansing and orchestration needs to be planned to keep the OLAP data stores up-to-date.
  • Unlike traditional, normalized, relational tables found in OLTP systems, OLAP data models tend to be multidimensional. This makes it difficult or impossible to directly map to entity-relationship or object-oriented models, where each attribute is mapped to one column. Instead, OLAP systems typically use a star or snowflake schema in place of traditional normalization.

OLAP in Azure

In Azure, data held in OLTP systems such as Azure SQL Database is copied into the OLAP system, such as Azure Analysis Services. Data exploration and visualization tools like Power BI, Excel, and third-party options connect to Analysis Services servers and provide users with highly interactive and visually rich insights into the modeled data. The flow of data from OLTP data to OLAP is typically orchestrated using SQL Server Integration Services, which can be executed using Azure Data Factory.

In Azure, all of the following data stores will meet the core requirements for OLAP:

  • SQL Server with Columnstore indexes
  • SQL Server Analysis Services (SSAS)

SQL Server Analysis Services (SSAS) offers OLAP and data mining functionality for business intelligence applications. You can either install SSAS on local servers or host it within a virtual machine in Azure. Azure Analysis Services is a fully managed service that provides the same major features as SSAS. Azure Analysis Services supports connecting to various data sources in the cloud and on-premises in your organization.

Clustered columnstore indexes are available in SQL Server 2014 and above, as well as Azure SQL Database, and are ideal for OLAP workloads. However, beginning with SQL Server 2016 (including Azure SQL Database), you can take advantage of hybrid transactional/analytical processing (HTAP) through the use of updateable nonclustered columnstore indexes. HTAP enables you to perform OLTP and OLAP processing on the same platform, which removes the need to store multiple copies of your data and eliminates the need for distinct OLTP and OLAP systems. For more information, see Get started with Columnstore for real-time operational analytics.

Key selection criteria

To narrow the choices, start by answering these questions:

Do you want a managed service rather than managing your own servers?

Do you require secure authentication using Microsoft Entra ID?

Do you want to conduct real-time analytics? If so, narrow your options to those that support real-time analytics.

Real-time analytics in this context applies to a single data source, such as an enterprise resource planning (ERP) application, that will run both an operational and an analytics workload. If you need to integrate data from multiple sources, or require extreme analytics performance by using pre-aggregated data such as cubes, you might still require a separate data warehouse.

Do you need to use pre-aggregated data, for example to provide semantic models that make analytics more business user friendly? If yes, choose an option that supports multidimensional cubes or tabular semantic models.

Providing aggregates can help users consistently calculate data aggregates. Pre-aggregated data can also provide a large performance boost when dealing with several columns across many rows. Data can be pre-aggregated in multidimensional cubes or tabular semantic models.

Do you need to integrate data from several sources, beyond your OLTP data store? If so, consider options that easily integrate multiple data sources.

Capability matrix

The following tables summarize the key differences in capabilities.

General capabilities

Capability | Azure Analysis Services | SQL Server Analysis Services | SQL Server with Columnstore Indexes | Azure SQL Database with Columnstore Indexes
Is managed service | Yes | No | No | Yes
Supports multidimensional cubes | No | Yes | No | No
Supports tabular semantic models | Yes | Yes | No | No
Easily integrate multiple data sources | Yes | Yes | No [1] | No [1]
Supports real-time analytics | No | No | Yes | Yes
Requires process to copy data from sources | Yes | Yes | No | No
Microsoft Entra integration | Yes | No | No [2] | Yes

[1] Although SQL Server and Azure SQL Database cannot be used to query from and integrate multiple external data sources, you can still build a pipeline that does this for you using SSIS or Azure Data Factory. SQL Server hosted in an Azure VM has additional options, such as linked servers and PolyBase. For more information, see Pipeline orchestration, control flow, and data movement.

[2] Connecting to SQL Server running on an Azure Virtual Machine is not supported using a Microsoft Entra account. Use a domain Active Directory account instead.

Scalability capabilities

| Capability | Azure Analysis Services | SQL Server Analysis Services | SQL Server with Columnstore Indexes | Azure SQL Database with Columnstore Indexes |
| --- | --- | --- | --- | --- |
| Redundant regional servers for high availability | Yes | No | Yes | Yes |
| Supports query scale out | Yes | No | Yes | Yes |
| Dynamic scalability (scale up) | Yes | No | Yes | Yes |

Contributors

This article is maintained by Microsoft. It was originally written by the following contributors.

Principal author:

  • Zoiner Tejada | CEO and Architect

Related resources

  • Columnstore indexes: Overview
  • Create an Analysis Services server
  • What is Azure Data Factory?
  • What is Power BI?
  • Big data architecture style


Online Transaction Processing (OLTP) and Online Analytical Processing (OLAP)


Benefits of using OLTP services:

  • The main benefit of OLTP services is that they respond to user actions immediately, since they can process queries very quickly.
  • OLTP services allow users to read, write, and delete data quickly.
  • Consistency: OLTP services ensure the consistency of data in real-time. Any changes made by one user will reflect immediately and accurately for all other users.
  • Data Integrity: OLTP services maintain data integrity by validating the input data to ensure that it conforms to the specified rules and constraints.
  • High Availability: OLTP services ensure high availability by providing real-time access to the data. They are designed to handle a large number of users and transactions without affecting the system’s performance.
  • Scalability: OLTP services are highly scalable and can handle an increasing number of users and transactions. This makes them ideal for applications that require real-time access to the data.
  • Security: OLTP services provide high levels of security by implementing various security features such as authentication, authorization, and encryption to ensure that only authorized users can access the data.
  • Better Decision Making: OLTP services provide real-time access to data, allowing users to make better and informed decisions. This is because the data is accurate, up-to-date, and reliable.

Drawbacks of OLTP services:

  • The major problem with OLTP services is that they are not fail-safe: if hardware fails, online transactions are affected.
  • OLTP allows many users to access and change the same data concurrently, which can cause conflicts if not carefully controlled.
  • Limited analysis capabilities: OLTP systems are designed to handle operational tasks and not intended for complex analysis or reporting. They lack the ability to aggregate and analyze large amounts of data quickly.
  • Limited scalability: OLTP systems are not easily scalable and may require significant infrastructure changes to handle increased transaction loads. This can result in costly downtime and impact on user experience.
  • Data integrity issues: With many users accessing and modifying data concurrently, OLTP systems may experience issues with data integrity, such as duplicate or inconsistent data.
  • High maintenance costs: OLTP systems require frequent maintenance, including backup and recovery procedures, to ensure data is not lost and the system is available to users at all times. This can result in high maintenance costs for organizations.


Benefits of using OLAP services:

  • The main benefit of OLAP services is that they help keep calculations and data consistent.
  • OLAP provides a single platform for planning, analysis, and budgeting in business analytics.
  • With OLAP as a service, security restrictions can easily be applied to protect data.
  • OLAP services provide a unified view of data from different sources, making it easier for business analysts to access and analyze data.
  • OLAP services enable complex queries and data analysis by providing a multidimensional view of data, allowing users to slice and dice data in various ways.
  • OLAP services support data mining and predictive analytics by providing access to historical data and trends, allowing users to identify patterns and make informed business decisions.
  • OLAP services can handle large volumes of data, making it suitable for enterprise-level business intelligence applications.
  • OLAP services can be integrated with other tools and applications, such as reporting and visualization tools, to provide a complete business intelligence solution.

Drawbacks of OLAP services:

  • The major problem with OLAP services is that they usually require IT professionals, because OLAP tools involve a complicated modeling procedure.
  • Although OLAP can serve as a single platform for planning, analysis, and budgeting, it requires cooperation between people in various departments, which creates dependency problems.
  • OLAP services can be very expensive to implement and maintain, especially for large datasets.
  • There may be a delay in data availability as data needs to be extracted, transformed, and loaded into the OLAP system before it can be analyzed.
  • OLAP services are optimized for read-heavy workloads, so write operations may be slower or less efficient.
  • OLAP services may not be suitable for real-time analysis or decision-making as data is typically updated on a periodic basis.  

The key differences between OLTP and OLAP databases:

| OLTP | OLAP |
| --- | --- |
| Characterized by a large number of short online transactions (INSERT, UPDATE, DELETE). | Characterized by a relatively low volume of transactions. |
| Queries are simple and easy to understand. | Queries are often very complex and involve aggregations. |
| Widely used for small transactions. | Widely used with data mining techniques. |
| Highly normalized. | Typically denormalized. |
| Backed up regularly and rigorously. | Backed up less frequently. |
| Typically stores transactional data using an entity model (usually 3NF). | Typically stores data using a star schema. |
| Performance is comparatively fast. | Performance is comparatively slow, since queries aggregate large volumes of data. |

OLTP and OLAP services differ, so it is wise to understand the differences and choose between them according to your application's needs.
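The star schema row in the table above can be sketched in code. The following pandas example uses an invented fact table and two invented dimension tables to show the typical OLAP pattern of joining facts to dimensions and then aggregating:

```python
import pandas as pd

# Denormalized star schema: a central fact table keyed to small
# dimension tables (all names and values are illustrative).
dim_product = pd.DataFrame({"product_id": [1, 2],
                            "category":   ["Clothing", "Footwear"]})
dim_time    = pd.DataFrame({"time_id": [10, 11],
                            "year":    [2023, 2024]})
fact_sales  = pd.DataFrame({"product_id": [1, 1, 2],
                            "time_id":    [10, 11, 11],
                            "amount":     [500.0, 700.0, 300.0]})

# A typical OLAP query: join facts to dimensions, then aggregate.
joined = (fact_sales
          .merge(dim_product, on="product_id")
          .merge(dim_time, on="time_id"))
by_category_year = joined.groupby(["category", "year"])["amount"].sum()
print(by_category_year.loc[("Clothing", 2024)])  # 700.0
```

In a normalized OLTP schema the same question would require joining many more tables; the star layout trades redundancy for simpler, faster analytical joins.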


OLAP systems - Definition, characteristics & use cases

A key to business success is improving the decision-making process through business intelligence: quick and easy access to the information held in data warehouses, with the ability to generate multidimensional queries.

Common functions of business intelligence technologies are: reporting, online analytical processing (OLAP), data search, knowledge discovery in data (data mining), business performance management, benchmarking, text mining and predictive analytics.

In this article, we’ll describe the technology of interactive analytical processing — OLAP, which is a relatively young technology with great potential for application in the business environment of start-ups and small to medium-sized enterprises.

What does OLAP mean?

The concept of OLAP is based on the principle of rich data representation. The term OLAP (online analytical processing) was introduced by Edgar Codd in 1993. Pointing to the shortcomings of relational models, he noted that they made it difficult to connect, view and analyze data across the multiple dimensions required in corporate analytics. He therefore defined general requirements for an OLAP system: extending the functionality of relational DBMSs and introducing a multidimensional conceptual view.

In many publications, OLAP is described as providing a rich, multidimensional view of data. Codd declared that "relational databases are the most suitable technology" for collecting corporate data: there is no need for new database technologies, but rather to complement the basic DBMS functions with data transfer and automated data analysis.

OLAP system — functions & features

Together with the multidimensional conceptual view, OLAP technology possesses the following features and functions:

  • Slice and dice technique — slicing takes a single value from one of the dimensions and creates a subset of the cube (a 3-dimensional array of data). For example, with a sales cube we can slice out certain years into a new data cube. Dicing chooses certain values across several dimensions and results in a new sub-cube.
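A minimal NumPy sketch of these two operations, treating the cube as a plain three-dimensional array (all axes and values are invented for illustration):

```python
import numpy as np

# A toy sales cube: axes are (year, region, product); values are sales.
# Shape (3, 2, 2): three years, two regions, two products.
cube = np.arange(12).reshape(3, 2, 2)
years = [2021, 2022, 2023]

# Slice: fix one dimension (year = 2022) -> a 2-D region x product view.
slice_2022 = cube[years.index(2022)]
print(slice_2022.shape)  # (2, 2)

# Dice: choose value ranges on several dimensions -> a smaller sub-cube.
sub_cube = cube[0:2, :, 0:1]  # years 2021-2022, all regions, first product
print(sub_cube.shape)  # (2, 2, 1)
```

Slicing drops a dimension entirely, while dicing keeps all dimensions but restricts their ranges.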


Intuitive data manipulation — means acting directly on the data cell, without accessing menus or performing multiple actions. This requirement met considerable resistance from OLAP tool manufacturers.

Using OLAP as an intermediate layer — emphasizes the usage of the OLAP engine as an intermediary between the data sources and the OLAP applications.

Background (batch) or interpreted access, supporting both variants: background access via a staging table and direct access to external data. Today, this is handled by using a hybrid approach (hybrid OLAP, HOLAP).

OLAP analysis models — support for four analysis models: parametric static reporting, "slicing and dicing" with data drilling, "what-if", and "goal seek" models.

Client-server architecture implies that the server component can support several different types of client applications.

Transparency — implies that the user receives all the necessary data, without the need to know where the data comes from. Direct data loading from multiple applications is supported.

Support for multiple users, i.e. concurrent multi-user access, while preserving data security and integrity.

Benefits of OLAP

The main benefits of the OLAP systems are:

Providing visual information and analysis of results. The purpose of the OLAP system is to easily process the information and find the logical connection between the data and the obtained results. Thus, subjectivity of the analysis is reduced.

OLAP technology enables the users to access and analyze “ad-hoc” data, review information obtained through comparisons, analysis of past data and data derived from various “what-if” scenarios.

OLAP applications are used by analysts and managers who often require an overview of highly aggregated data, such as total sales of one production line of the company, or of one region of the country.

The OLAP database (or analytical DB) is subject to changes caused by multiple data sources, and provides a powerful analytical backend to many user applications.

In contrast to SQL queries, which work over the transactional database and answer direct inquiries, OLAP goes a step further into the sphere of data meaning, answering whether the assumptions extracted from the database data are true.

OLAP also creates a series of hypothetical patterns and relationships between data and uses query systems to confirm or disprove them.

Creating a unified platform. A stable OLAP system can serve as a unified platform for all predictive analytics in the company.

Successful OLAP use cases

Financial sector applications  — OLAP applications can be applied in different business domains in order to ease the planning, budgeting, analyses and financial reporting. With proper utilization of OLAP, powerful reports can be created. These reports, in turn, can be very helpful for organizations that want to achieve better efficiency and effectiveness. A multidimensional database (MDDB) is responsible for executing all OLAP requests. Control panels (dashboards) provide visual representation of the most important information needed to achieve a company’s goals. This information is consolidated and displayed on one screen, so it is possible to easily monitor the status of the entire organization.

Sales applications  — Sales teams should primarily be focused on the revenue category, not the profit category. The main goal of applying OLAP reporting in the sales area is to align sales activities with the corporate goal of increasing profits. The main problem in sales reporting is the time required to collect data, analyze it, and create and distribute reports. That time can be measured in hours, and often in days; OLAP reports, however, allow sales teams to quickly manipulate sales-related information.

Marketing use case  — OLAP reporting in the marketing sphere allows companies to build a unified view of customers. Using a rich set of algorithms and search tools, users can obtain useful information such as buying habits, sales forecasts, key influencers, market movements and campaign effectiveness.


OLAP, or online analytical processing, is technology for performing high-speed complex queries or multidimensional analysis on large volumes of data in a data warehouse, data lake or other data repository. OLAP is used in business intelligence (BI), decision support, and a variety of business forecasting and reporting applications.

Most business data have multiple dimensions—multiple categories into which the data are broken down for presentation, tracking, or analysis. For example, sales figures might have several dimensions related to location (region, country, state/province, store), time (year, month, week, day), product (clothing, men/women/children, brand, type), and more.

But in a data warehouse or data lake, data sets are stored in tables, each of which can organize data into just two of these dimensions at a time. OLAP extracts data from multiple relational data sets and reorganizes it into a multidimensional format that enables very fast processing and very insightful analysis. 


The core of most OLAP systems, the OLAP cube is an array-based multidimensional database that makes it possible to process and analyze multiple data dimensions much more quickly and efficiently than a traditional relational database.

A relational database table is structured like a spreadsheet, storing individual records in a two-dimensional, row-by-column format. Each data “fact” in the database sits at the intersection of two dimensions–a row and a column—such as region and total sales.

SQL and relational database reporting tools can certainly query, report on, and analyze multidimensional data stored in tables, but performance slows down as the data volumes increase. And it requires a lot of work to reorganize the results to focus on different dimensions.

This is where the OLAP cube comes in. The OLAP cube extends the single table with additional layers, each adding additional dimensions—usually the next level in the “concept hierarchy” of the dimension. For example, the top layer of the cube might organize sales by region; additional layers could be country, state/province, city and even specific store.

In theory, a cube can contain an infinite number of layers. (An OLAP cube representing more than three dimensions is sometimes called a hypercube.) And smaller cubes can exist within layers—for example, each store layer could contain cubes arranging sales by salesperson and product. In practice, data analysts will create OLAP cubes containing just the layers they need, for optimal analysis and performance. 

The drill-down operation converts less-detailed data into more-detailed data through one of two methods—moving down in the concept hierarchy or adding a new dimension to the cube. For example, if you view sales data for an organization’s calendar or fiscal quarter, you can drill-down to see sales for each month, moving down in the concept hierarchy of the “time” dimension.

Roll up is the opposite of the drill-down function—it aggregates data on an OLAP cube by moving up in the concept hierarchy or by reducing the number of dimensions. For example, you could move up in the concept hierarchy of the “location” dimension by viewing each country's data, rather than each city.
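The two operations can be sketched with pandas, assuming an invented month-to-quarter concept hierarchy:

```python
import pandas as pd

# Illustrative sales data with a time hierarchy: quarter -> month.
sales = pd.DataFrame({
    "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2"],
    "month":   ["Jan", "Feb", "Mar", "Apr", "May", "Jun"],
    "amount":  [10, 20, 30, 40, 50, 60],
})

# Roll-up: aggregate upward in the hierarchy (months -> quarters).
by_quarter = sales.groupby("quarter")["amount"].sum()
print(by_quarter["Q1"])  # 60

# Drill-down: move back down for more detail (quarter -> its months).
q1_detail = sales[sales["quarter"] == "Q1"].set_index("month")["amount"]
print(q1_detail["Feb"])  # 20
```

In a real OLAP engine these operations run against pre-built cube aggregates rather than recomputing group-bys, but the logical effect is the same.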

Slice and dice

The slice operation creates a sub-cube by selecting a single dimension from the main OLAP cube. For example, you can perform a slice by highlighting all data for the organization's first fiscal or calendar quarter (time dimension).

The dice operation isolates a sub-cube by selecting several dimensions within the main OLAP cube. For example, you could perform a dice operation by highlighting all data by an organization’s calendar or fiscal quarters (time dimension) and within the U.S. and Canada (location dimension).
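On tabular data, the slice and dice operations just described can be approximated with pandas boolean filters; the records below are invented to mirror the quarter and country examples above:

```python
import pandas as pd

# Illustrative data matching the examples above (values assumed).
df = pd.DataFrame({
    "quarter": ["Q1", "Q1", "Q2", "Q2", "Q1"],
    "country": ["US", "Canada", "US", "Mexico", "Mexico"],
    "sales":   [100, 80, 120, 60, 40],
})

# Slice: fix a single dimension (first quarter only).
q1 = df[df["quarter"] == "Q1"]

# Dice: combine criteria on several dimensions (quarters x US/Canada).
diced = df[df["quarter"].isin(["Q1", "Q2"]) & df["country"].isin(["US", "Canada"])]
print(len(q1), len(diced))  # 3 3
```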

The pivot function rotates the current cube view to display a new representation of the data, enabling dynamic multidimensional views. The OLAP pivot function is comparable to the pivot table feature in spreadsheet software such as Microsoft Excel, but OLAP pivots are easier to use (less expertise is required) and offer faster response times and query performance.
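A small pandas sketch of the pivot idea, rotating the same invented sales facts between two views of the data:

```python
import pandas as pd

# Illustrative sales records; pivoting rotates the view of the same data.
df = pd.DataFrame({
    "region":  ["East", "East", "West", "West"],
    "product": ["A", "B", "A", "B"],
    "sales":   [10, 20, 30, 40],
})

# View 1: regions as rows, products as columns.
view1 = df.pivot_table(index="region", columns="product", values="sales")

# Pivot: swap the axes for a different representation of the same facts.
view2 = df.pivot_table(index="product", columns="region", values="sales")
print(view1.loc["West", "B"], view2.loc["B", "West"])
```

Both views contain exactly the same facts; only the orientation changes, which is what the OLAP pivot operation does across cube dimensions.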

OLAP that works directly with a multidimensional OLAP cube is known as multidimensional OLAP, or MOLAP. Again, for most uses, MOLAP is the fastest and most practical type of multidimensional data analysis.

However, there are two other types of OLAP which may be preferable in certain cases:

ROLAP, or relational OLAP, is multidimensional data analysis that operates directly on data in relational tables, without first reorganizing the data into a cube.

As noted previously, SQL is a perfectly capable tool for multidimensional queries, reporting, and analysis. But the SQL queries required are complex, performance can drag, and the resulting view of the data is static—it can't be pivoted to represent a different view of the data. ROLAP is best when the ability to work directly with large amounts of data is more important than performance and flexibility.
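A ROLAP-style query can be sketched with plain SQL over a relational table. Here SQLite stands in for the relational store, and the schema is invented for illustration:

```python
import sqlite3

# ROLAP-style query: aggregate directly over a relational table,
# with no cube built first (schema and data are illustrative).
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, year INTEGER, amount REAL)")
con.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", 2023, 100.0), ("East", 2024, 150.0),
    ("West", 2023, 200.0), ("West", 2024, 250.0),
])

# GROUP BY expresses the dimensional roll-up in plain SQL.
rows = con.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('East', 250.0), ('West', 450.0)]
```

This is the trade-off the paragraph describes: no pre-built cube to maintain, but every query pays the full cost of scanning and aggregating the relational data.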

HOLAP , or hybrid OLAP , attempts to create the optimal division of labor between relational and multidimensional databases within a single OLAP architecture. The relational tables contain larger quantities of data, and OLAP cubes are used for aggregations and speculative processing. HOLAP requires an OLAP server that supports both MOLAP and ROLAP.

A HOLAP tool can "drill through" the data cube to the relational tables, which paves the way for quick data processing and flexible access. This hybrid system can offer better scalability but can't escape the inevitable slow-down when accessing relational data sources. Also, its complex architecture typically requires more frequent updates and maintenance, as it must store and process all the data from relational databases and multidimensional databases. For this reason, HOLAP can end up being more expensive.

Online transaction processing , or OLTP , refers to data-processing methods and software focused on transaction-oriented data and applications. 

The main difference between OLAP and OLTP is in the name: OLAP is analytical in nature, and OLTP is transactional. 

OLAP tools are designed for multidimensional analysis of data in a data warehouse, which contains both transactional and historical data. In fact, an OLAP server is typically the middle, analytical tier of a data warehousing solution. Common uses of OLAP include data mining and other business intelligence applications, complex analytical calculations, and predictive scenarios, as well as business reporting functions like financial analysis, budgeting, and forecast planning.

OLTP is designed to support transaction-oriented applications by processing recent transactions as quickly and accurately as possible. Common uses of OLTP include ATMs, e-commerce software, credit card payment processing, online bookings, reservation systems, and record-keeping tools.

For a deep dive into the differences between these approaches, check out "OLAP vs. OLTP: What's the Difference?"

OLAP enables companies to maximize the potential of their corporate data by transforming it into the most practical format for multidimensional analysis. This, in turn, makes it easier to discern valuable business insights. However, if these systems are kept in-house, it limits the potential for scaling.

Cloud-based OLAP services are less expensive and easier to set up, making them more attractive for small businesses or startups on a budget. Enterprises can tap into the vast potential of cloud-based data warehouses that perform sophisticated analytics at unrivaled speeds because they use massively parallel processing (MPP). Therefore, companies can use OLAP at cloud speed and scale, analyzing vast amounts of data without moving it from their cloud data warehouse.

Constance Hotels, Resorts & Golf is a luxury hotel group with nine properties on islands in the Indian Ocean. However, a lack of island-to-island communications gave way to organizational silos, with business data isolated in each resort. The organization built a cloud data warehouse and analytics architecture to link all on-premises systems and tools with a central cloud-based data repository. In doing this, the company gained the group-wide insight they needed to leverage advanced, predictive analytics and implement an OLAP system.

OLAP in cloud architecture is a fast and cost-effective solution built for the future. Once the cubes are made, teams can use existing business intelligence tools to instantly connect with the OLAP model and draw interactive real-time insights from their cloud data.

IBM Db2 Warehouse on Cloud is a managed public cloud service. You can set up IBM Db2 Warehouse on premises with your own hardware or in a private cloud.

IBM DB2 Warehouse integrates and simplifies the data warehouse environment delivering dynamic warehousing and providing direct support for OLAP against the data warehouse.


Using OLAP Tools for e-HRM: A Case Study

Carmen Freitas and Alysson Prado, University of Campinas (published by IGI Global)




Published in: United States

Author tags

  • Absenteeism
  • Business Intelligence
  • Data Warehousing
  • Human Resource Management
  • Online Analytical Processing
  • Workforce Analysis



Related Papers

Alaa Hamoud

Organizations always need to manage their operations, process data electronically, and find a platform that helps them support their strategic decisions. The success of the human resource department can reflect overall organizational success. Human resource professionals try to ensure finding the right person at the right time for the job, fitting the person according to skills and qualifications. This task needs a platform that supports making the right decision based on historical managerial information. A data mart is a department-based decision support system that uses departmental data to help decision-makers support short-term decisions. A human resource (HR) data mart is the foundation stone for building an enterprise data warehouse. The paper presents the implementation of an HR data mart, from the data mart schema through to online analytical processing (OLAP) reports. The data mart is implemented on retired employees' data from the Basra Oil Company covering over 15 years. A human resource data mart can provide a base platform for different analysis operations that support the right decisions. Different OLAP reports are implemented to help analysts and decision-makers get answers to their questions as OLAP queries. Two categories of reports are implemented: offline reports using Microsoft Excel Pivot Table 2010, and web OLAP reports using SQL Server Reporting Services 2014 (SSRS). The tools used to implement the data mart include SQL Server Management Studio (SSMS) 2014, SQL Server Integration Services 2014 (SSIS), SQL Server Analysis Services 2014 (SSAS), SQL Server Reporting Services 2014 (SSRS), SQL Server Data Tools 2013 (SSDT), and Microsoft Excel Pivot Table 2010.


Global Trends in Human Resource Management

Yitzhak Fried

Emmanuel C Navarro

This study focused on the system development and assessment of a “Human Resource Information Management System.” It aimed to describe its developments using the Agile Software Development Method in terms of analysis, design, development, testing and evaluation. It also aimed in assessing its system qualities in terms of technicality, functionality and usability both by IT experts and HR practitioners as end-users based on the ISO/IEC 25010:2011 standards. The system was developed either for public or private organization. This study was conceptualized to further enhance a system that will consolidate the employees’ records. Its functionalities included the maintenance of employees’ records, attendance and leave monitoring, class schedules, related to work email messages notification for employees, ranking and evaluation, reduce paper management of employee’s records, generate service records, generate a pre-formatted PDS records compliant to recent Civil Service Commission format an...

IAEME Publication

At the onset of the 21st century, we witness change and the upgrading of skills every day. Human resources today lays stress on aligning human resource management with an organization's overall business goals and objectives. Human resources today combines with artificial intelligence to eliminate repetitive tasks, accelerate the search for talent, reduce employee attrition, and improve employee engagement. Human resource management is the most important component for a business or an organization to be successful. Its role is to recruit, coordinate, and train people in a strategic way so as to get the maximum advantage from their work-related activities. Human resource professionals are a type of middleman in a business system who advise managers and management on issues related to assigning employees different responsibilities in an organization, so that resources can be utilized effectively. In an organization, employees are often shifted from one department to another, taking their skills into consideration. Over the past three decades there has been a tremendous change in human resource management under the influence of new technology. HR processes have changed the way companies collect, store, use and circulate information about employees. The process of recruitment has seen an exhaustive change: gone are the days when job boards and print media were used; electronic recruiting is the new trend, leading to a qualified, diverse and motivated applicant pool. With the help of e-recruitment, HR gets a greater variety of candidates to choose from. More than 90% of large companies today use more than one form of technology to advertise jobs. This trend gives birth to a more advanced technology that helps select the best-suited applicant from the pool gathered through e-recruitment, i.e., artificial intelligence.

The paper presents a solution for integrating many aspects of human resource management, with extensible customization possibilities (language, interface) to meet the specific needs of various organizations, both with regard to the organization as a whole and in terms of the different categories of users that can use the platform. It also provides an overview of and perspective on using open-source software in human resource management.

Sergey Zykov

Human resources management systems have a wide audience at present. However, no truly integrated solution has yet been proposed to improve the systems concerned. This paper attempts a classification of existing approaches. Possible approaches to collecting extra data for decision-making are considered, including psychological testing and fixed-assets information as well as product sales data. …

International Journal of Human Resource Studies

Dr. Puja Sareen

Today, HR is not treated as a single function. It is a collection of highly specialized capabilities, each with distinct objectives, tasks and needs. There is ever-increasing pressure on the Human Resource (HR) function to support strategic goals and to focus on value-adding activities. Organizations have realized the growing importance of using Information Technology (IT) to leverage their HR functions. This takes the form of e-HRM (Electronic Human Resource Management). The e-HRM revolution relies on cutting-edge information technology, ranging from Internet-enabled human resources information systems (HRIS) to corporate intranets and portals. The driving forces are intensifying competition, the need to manage a workforce on a global level, improved HR service delivery and cost savings. e-HRM enables HR leaders to become architects in the development of competitive organizational social systems. This paper reviews the research work done in the field of e-HRM.

The days of campus selection, scanning job boards, advertising, and open houses for recruitment are obsolete. Organizations no longer gain any competitive advantage from these methods, nor do the methods cover future talent demands. To overcome this obsolete trend, "People Analytics" in the age of Big Data does a great job of melding analytical processes and methods with the large data that is available today. Big data is an ocean; diving into it with predictive analytics as the fishing gear virtually ensures that we will catch what we are fishing for. There is no shortage of qualified personnel in any company, but what we require is a world-class talent acquisition system, and this is where predictive analytics comes to our rescue. What is in short supply are 21st-century talent acquisition strategies: mere recruitment does not mean the selection of qualified and talented personnel, and the employment methods, by whatever labels are in vogue, are simply functional tools. It has been observed for decades that the organizational purpose of recruitment is often not fully addressed. Everything starts with the organization's purpose, goals and strategic plans; management must first be clear that data and analytics can be brought together in forming a strategy. Big Data is the talent ocean, analytics the fishing gear, and analytics helps management find the school of fish. It is an era in which to learn some of the latest techniques and best practices for using different types of human resources across the enterprise. In this context, it is imperative to study People Analytics in the age of Big Data. This paper focuses on international experiences with People Analytics from select countries and its growing relevance in the HR of select companies in India.

Marek Miłosz , Sergio Luján-Mora

Advances in Science, Technology and Engineering Systems Journal

Chelsea Adora


User Story Based Automated Test Case Generation Using NLP

  • Conference paper
  • First Online: 30 August 2024

  • Arunkumar Chinnaswamy,
  • B. A. Sabarish &
  • R. Deepak Menan

Part of the book series: IFIP Advances in Information and Communication Technology (IFIP AICT, volume 717)

Included in the following conference series:

  • International Conference on Computational Intelligence in Data Science

The progress of technology requires software systems to be of higher quality in order to meet increasingly complex and frequently changing needs. The present software development life cycle prioritizes adjustment to evolving client requirements across the different stages of project development, facilitated by Continuous Integration and Continuous Deployment. This process produces a substantial volume of data that can serve as a valuable resource for automating test case production and reducing the need for manual intervention. This paper presents a technique that utilizes natural language processing to automate the generation of test cases, minimizing the need for human involvement. The proposed approach has three phases: input-output categorization utilizing sentiment analysis, production of regular expressions, and generation of test cases. The main contribution of this article is the classification of user keywords and the construction of test cases from them. The proposed model generates diverse outputs to create both positive and negative test cases, and has been evaluated on 700 user stories that articulate requirements at varying levels of abstraction.
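The three phases described in the abstract above (keyword classification, regular-expression production, and positive/negative test-case generation) can be sketched roughly as follows. This is a minimal illustration only: the keyword repository, sample values, and `generate_test_cases` helper are hypothetical, not the authors' implementation.

```python
import re

# Hypothetical keyword -> validation-regex repository; the paper's actual
# mapping rules and repository are not reproduced here.
PATTERN_REPO = {
    "email": r"[^@\s]+@[^@\s]+\.[a-z]{2,}",
    "phone": r"\d{10}",
}

# Illustrative sample values (valid, invalid) for each known field.
SAMPLES = {
    "email": ("user@example.com", "not-an-email"),
    "phone": ("9876543210", "12ab"),
}

def generate_test_cases(user_story: str) -> list[dict]:
    """Spot known input keywords in a user story, look up a validation
    regex for each, and emit one positive and one negative test case."""
    cases = []
    for keyword, pattern in PATTERN_REPO.items():
        if keyword in user_story.lower():
            for value in SAMPLES[keyword]:
                cases.append({
                    "field": keyword,
                    "input": value,
                    "expect_valid": bool(re.fullmatch(pattern, value)),
                })
    return cases

story = "As a user, I can register with my email address and phone number"
assert [c["expect_valid"] for c in generate_test_cases(story)] == [True, False, True, False]
```

A real pipeline would replace the literal keyword match with the sentiment-analysis-based input-output classification the paper describes.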



Author information

Authors and Affiliations

Amrita School of Computing, Coimbatore, India

Arunkumar Chinnaswamy, B. A. Sabarish & R. Deepak Menan

Corresponding author

Correspondence to Arunkumar Chinnaswamy.

Editor information

Editors and Affiliations

Wroclaw University of Economics and Business, Wrocław, Poland

Mieczyslaw Lech Owoc

Sri Sivasubramaniya Nadar College of Engineering, Chennai, Tamil Nadu, India

Felix Enigo Varghese Sicily

Kanchana Rajaram

Prabavathy Balasundaram

Ethics declarations

The authors have no competing interests to declare that are relevant to the content of this article.


Copyright information

© 2024 IFIP International Federation for Information Processing

About this paper

Cite this paper

Chinnaswamy, A., Sabarish, B.A., Deepak Menan, R. (2024). User Story Based Automated Test Case Generation Using NLP. In: Owoc, M.L., Varghese Sicily, F.E., Rajaram, K., Balasundaram, P. (eds) Computational Intelligence in Data Science. ICCIDS 2024. IFIP Advances in Information and Communication Technology, vol 717. Springer, Cham. https://doi.org/10.1007/978-3-031-69982-5_12


Published: 30 August 2024

Publisher Name: Springer, Cham

Print ISBN: 978-3-031-69981-8

Online ISBN: 978-3-031-69982-5

eBook Packages: Computer Science, Computer Science (R0)



COMMENTS

  1. OLAP over Big COVID-19 Data: A Real-Life Case Study

    This paper focuses the attention on a real-life case study represented by the design, the development and the practice of OLAP tools over big COVID-19 data in Canada. The OLAP tools developed in this context are further enriched by machine learning procedures that magnify the mining effect. The contribution presented in this paper also embeds an implicit methodology for OLAP over big COVID-19 ...

  2. Understanding OLTP and OLAP: A Comprehensive Guide for Data ...

    Case Studies: (Bringing Concepts to Life) Imagine you're managing your personal finances using a banking app. Every day, you check your account balance, transfer funds between accounts, and pay ...

  3. OLAP: A Deep Dive into Online Analytical Processing

    OLAP and Big Data. A case study conducted by IBM reported that organizations implementing OLAP solutions experienced, on average, a 20% improvement in return on investment (ROI) within the first year. As organizations grapple with the influx of big data, OLAP systems must adapt to handle vast datasets. The intersection of OLAP and big data ...

  4. OLTP vs OLAP: Understanding the differences and use cases

    OLAP databases and data warehouses give analysts and decision-makers the ability to use custom reporting tools to turn data into information. Query failure in OLAP does not interrupt or delay transaction processing for customers, but it can delay or impact the accuracy of business intelligence insights. ETL: Bridging the gap between OLTP and OLAP

  5. Online Analytical Processing (OLAP)

    The illustration depicted in Fig. 19.1 shows a typical journey of data from an operational database, data warehousing and OLAP to data analytics. All of the previous chapters focus on the left-hand side, the transformation process from the operational database to the data warehouse, where we discuss in length the step-by-step transformation process of various cases, using extensive case studies.

  6. What is OLAP and OLTP, use-cases, comparison, and examples

    Supply chain analysis. In summary, OLTP systems are used for transactional processing, while OLAP systems are used for analytical processing. OLTP systems are optimized for fast read and write performance, while OLAP systems are optimized for fast query performance. Here is a list of some popular OLTP and OLAP tools:

  7. Cloud-Based OLAP over Big Data: Application Scenarios and Performance

    Following our previous research results, in this paper we provide two authoritative application scenarios that build on top of OLAP*, a middleware for parallel processing of OLAP queries that truly realizes effective and efficiently OLAP over Big Data. We have provided two authoritative case studies, namely parallel OLAP data cube processing and virtual OLAP data cube design, for which we also ...

  8. Towards Exploratory OLAP Over Linked Open Data

    In this paper, we describe a framework for so-called exploratory OLAP over RDF sources. We propose a system that uses a multidimensional schema of the OLAP cube expressed in RDF vocabularies. Based on this information the system is able to query data sources, extract and aggregate data, and build a cube.

  9. Multiple OLAP Reports and the Best Compromise in the ...

    In this section, we present the OLAP decision-making process using the BeCoMe method. Through the OLAP case study, we shall illustrate how the BeCoMe method can be used for decision-making by managers and executives under the conditions of multiple views on one issue. The aim of this case study was not to describe this management decision-making.

  10. A case study for data warehousing courseware

    The objective of this project is to develop a web-based interactive courseware to help data warehouse designers to enhance understanding of the key concepts of OLAP using a case study approach. The courseware will help users to understand the concepts of OLAP with practical examples. This courseware provides an opportunity for students to ...

  11. Online analytical processing (OLAP)

    Online analytical processing (OLAP) is a technology that organizes large business databases and supports complex analysis. It can be used to perform complex analytical queries without negatively affecting transactional systems. The databases that a business uses to store all its transactions and records are called online transaction processing ...

  12. Online Transaction Processing (OLTP) and Online Analytic Processing (OLAP)

    Online Transaction Processing (OLTP): OLTP databases are meant to be used for many small transactions, and usually serve as a "single source of storage". An example of an OLTP system is an online movie ticket booking website: if two people want to book the same seat for the same movie and showing at the same time, whoever completes the transaction first gets the seat ...

  13. OLAP systems

    The purpose of the OLAP system is to easily process the information and find the logical connection between the data and the obtained results. Thus, subjectivity of the analysis is reduced. OLAP technology enables the users to access and analyze "ad-hoc" data, review information obtained through comparisons, analysis of past data and data ...

  14. What is OLAP?

    The core of most OLAP systems, the OLAP cube is an array-based multidimensional database that makes it possible to process and analyze multiple data dimensions much more quickly and efficiently than a traditional relational database. A relational database table is structured like a spreadsheet, storing individual records in a two-dimensional, row-by-column format.

  15. P2P OLAP: Data model, implementation and case study

    Revising a dimension instance allows to produce consistent aggregations when an OLAP query is answered at more than one node. We then describe an implementation of a P2P system for answering OLAP queries over a network of data warehouses. We apply our proposal to a real-world case study of an insurance group. Finally, we report the results of ...

  16. P2P OLAP: Data model, implementation and case study

    3.1. P2P OLAP overview. The model proposed in [2] considers each node as a peer in a cooperative query system, and allows each peer a high degree of autonomy. Each node involved in the system defines a context where it becomes the local peer, and all its dimensions and fact tables are considered local henceforth.

  17. A case study for data warehousing courseware

    The courseware will help users to understand the concepts of OLAP with practical examples. This courseware provides an opportunity for students to generate various summary reports from example data with the help of dropdown list on the web pages. In addition, the students can also work on exercises based on the examples provided in the courseware.

  18. Understanding OLAP (Online Analytical Processing): Practical ...

    By using OLAP for data analysis, retail businesses can gain the insights they need to understand their customers better, optimize their operations, and ultimately, drive greater success.

  19. Using OLAP Tools for e-HRM: A case study

    Several literary studies [2, 34], which address decision support systems (DSS), use the OLAP query language (OnLine Analytical Processing) for analytical real-time processing of large amounts of ...

  20. The Information Systems (IS) Role of Accountants: A Case Study of an On

    information available from various databases. This study reports a case of implementing an OLAP tool to build complex financial reports for the use of senior management. The case illustrates the importance of the IS role of accountants with the emergence of the "systems accounting" role and the benefits of OLAP to accountants. Introduction

  21. Using OLAP Tools for e-HRM: A Case Study

    This paper describes a project in which the authors built a Data Warehouse containing actual Human Resource data. This paper provides data models and shows their use through OLAP software and their presentation to end-users using a web portal. The authors also discuss the progress, and some obstacles of the project, from the IT staff's viewpoint.

  22. Using OLAP Tools for e-HRM: A Case Study

    Using OLAP tools for e-HRM: a case study. Alysson Bolognesi Prado, Carmen Freitas and Thiago Ricardo Sbrici (Unicamp - State University of Campinas, Human Resources Department - DGRH, Campinas, Sao Paulo, Brazil; {alysson,carmenf,thiago}@unicamp.br). In the growing challenge of managing people, the Human Resources department needs effective ...

  23. User Story Based Automated Test Case Generation Using NLP

    6.1 Regular Expression Generation and Test Case Generation. A list of words extracted from word classification is used to match keywords, which are then mapped to a repository to generate a regular expression for each input and output word, as shown in Fig. 6(a). The NLP techniques presented above are leveraged to automatically produce test cases that align with each user story.
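Several of the excerpts above describe the OLAP cube and its roll-up operation over a multidimensional fact table. As a minimal, self-contained sketch (toy data and a hypothetical `roll_up` helper, not any particular product's API), rolling up aggregates the measure over every dimension that is not kept:

```python
from collections import defaultdict

# Toy fact table: (region, product, month, sales). Illustrative data only.
facts = [
    ("EU", "laptop", "Jan", 10),
    ("EU", "phone",  "Jan", 20),
    ("US", "laptop", "Jan", 30),
    ("US", "laptop", "Feb", 40),
]

def roll_up(facts, keep):
    """Aggregate sales over all dimensions not named in `keep`.
    Dimensions are (region, product, month), in that order."""
    dims = ("region", "product", "month")
    idx = [dims.index(d) for d in keep]
    totals = defaultdict(int)
    for *coords, sales in facts:
        key = tuple(coords[i] for i in idx)
        totals[key] += sales
    return dict(totals)

# Roll product and month away, keeping per-region totals.
assert roll_up(facts, ["region"]) == {("EU",): 30, ("US",): 70}
```

Pre-computing such aggregations at several levels of granularity is what lets cube engines answer roll-up and drill-down queries quickly instead of rescanning the relational fact table.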