Get 50+ SAP BW on HANA Interview Questions and Answers
Last updated on 23rd September 2022
1. What is SAP HANA?
Ans:
SAP HANA stands for High-Performance Analytic Appliance, an in-memory computing engine. HANA is linked to ERP systems; the front-end modeling studio can be used for replication server management and load control.
2. What is a global transfer routine?
Ans:
This is a transfer routine (ABAP) defined at the InfoObject level; it is common to all source systems.
3. What is a SAP HANA-optimized InfoCube?
Ans:
A SAP HANA-optimized InfoCube is a standard InfoCube that is optimized for use with SAP HANA. When you create a SAP HANA-optimized InfoCube, you can assign characteristics and key figures to dimensions. However, the system does not create any dimension tables apart from the package dimension.
4. What are the various perspectives in SAP HANA Studio?
Ans:
- SAP HANA Administration Console
- SAP HANA Modeler
- Application Development
- Lifecycle Management
- SAP HANA Development
- ABAP
- BW Modeling
5. What is the role of the persistence layer in SAP HANA?
Ans:
SAP HANA's in-memory computing engine accesses data directly in memory. To avoid the risk of losing data in the event of a hardware failure or power outage, the persistence layer acts as a safeguard and stores all data on non-volatile disk storage.
6. What is Data Warehouse Mixed Architecture?
Ans:
It is an SAP best practice for modern data warehousing. Simply put, it is a data model that is implemented at the same time in SAP BW and in native SAP HANA.
7. How does SAP HANA support massively parallel processing?
Ans:
With the availability of multi-core CPUs, higher CPU execution speeds can be achieved. HANA's column-based storage also makes it easy to execute operations in parallel using multiple processor cores. In a column store, data is already vertically partitioned, which means that operations on different columns can easily be processed in parallel. If multiple columns need to be searched or aggregated, each of these operations can be assigned to a different processor core. In addition, operations on a single column can be parallelized by partitioning the column into multiple sections that are processed by different processor cores. As a result, the SAP HANA database can execute queries rapidly and in parallel (see the sketch below).
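The following is a minimal, hedged sketch (not a BW-generated object) of how a column table could be hash-partitioned in SAP HANA SQL so that scans and aggregations can be spread across cores. It uses SAP's hdbcli Python driver; the host, port, credentials, and the SALES_FACT table are placeholders.

```python
# Minimal sketch: creating a hash-partitioned column table in SAP HANA
# so scans and aggregations can be processed in parallel per partition.
# Connection details (host, port, credentials) are placeholders.
from hdbcli import dbapi  # SAP's Python driver for HANA

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO_USER", password="secret")
cur = conn.cursor()

# Column store is HANA's preferred table type; PARTITION BY HASH splits
# the table into 4 parts that independent cores can scan in parallel.
cur.execute("""
    CREATE COLUMN TABLE SALES_FACT (
        ORDER_ID    INTEGER,
        CUSTOMER_ID INTEGER,
        AMOUNT      DECIMAL(15,2)
    ) PARTITION BY HASH (ORDER_ID) PARTITIONS 4
""")

# An aggregation over a single column touches only that column's storage
# and can be processed per partition in parallel.
cur.execute("SELECT CUSTOMER_ID, SUM(AMOUNT) FROM SALES_FACT GROUP BY CUSTOMER_ID")
print(cur.fetchall())

cur.close()
conn.close()
```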
8. What is the data flow in BW/BI?
Ans:
Data flows from the transactional system to the analytical system (BW). The DataSource on the transactional system needs to be replicated on the BW side and attached to an InfoSource and update rules.
9. What is the use of lean and dedicated layers in BW/4HANA?
Ans:
A lean architecture relies on dynamic models: LSA++ (Layered Scalable Architecture) for SAP BW/4HANA and SAP BW powered by HANA, together with dynamic dimensional modeling.
10. What is a source system?
Ans:
Any system that sends data to BW, such as R/3, a flat file, an Oracle database, or an external system.
11. How would you optimize dimensions in BW?
Ans:
A dimension in BW is a collection of reference data about a measurable event in data warehousing. In this context, events are known as "facts". For example, a customer dimension's attributes could include first and last name, gender, birth date, etc. To optimize dimensions, do not put the most dynamic characteristics into the same dimension, and keep dimensions small. Also, define as many dimensions as possible, and a dimension should not exceed 20% of the fact table size.
12. What are BW statistics and how are they used?
Ans:
The set of cubes delivered by SAP is used to measure performance for queries, data loads, etc. BW statistics, as the name suggests, show data about the costs associated with BW queries, OLAP, aggregated data, etc. They are useful for measuring how quickly queries are calculated or how quickly data is loaded into BW.
13. What is ODS (Operational Data Store)?
Ans:
An 'Operational Data Store' or 'ODS' is used for detailed storage of data. It is a BW architectural component that sits between the PSA (Persistent Staging Area) and InfoCubes, and it allows BEx (Business Explorer) reporting. It is primarily used for detailed reporting rather than dimensional analysis, and it is not based on a star schema. ODS objects do not aggregate data the way InfoCubes do. To load data into an ODS object, new records are inserted, existing records are updated, or old records are deleted, as specified by the RECORDMODE value.
14. What is the Visual BI Document Management for SAP Lumira?
Ans:
Visual BI's Document Management for SAP Lumira is a small service that integrates with the SAP BusinessObjects platform.
15. What are the advantages of SLT replication?
Ans:
- SAP SLT works on a trigger-based approach; this approach has no measurable performance impact on the source system.
- It offers filtering and transformation capabilities.
- It enables real-time data replication, replicating only relevant data into HANA from SAP and non-SAP source systems.
- It is fully integrated with SAP HANA Studio.
- Replication from several source systems to one HANA system is allowed, as is replication from one source system to multiple HANA systems.
16. What is a MultiProvider in SAP BI? What are the features of MultiProviders?
Ans:
- A MultiProvider does not contain any data.
- The data comes entirely from the InfoProviders on which it is based.
- The InfoProviders are connected to one another by union operations.
- InfoProviders and MultiProviders are objects or views relevant for reporting.
- A MultiProvider allows you to run reports using several InfoProviders; that is, it is used for creating reports based on one or more InfoProviders at a time.
17. What is the role of the master controller job in SAP HANA (SLT)?
Ans:
- Creating database triggers and a logging table in the source system (see the simplified sketch after this list).
- Creating synonyms.
- Writing new entries into the admin tables on the SLT server when a table is replicated or loaded.
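Below is a deliberately simplified, hypothetical illustration of the trigger-plus-logging-table pattern. It is not SLT's actual generated code; SQLite is used as a stand-in database purely so the sketch runs anywhere, and all object names are invented.

```python
# Hypothetical illustration of the trigger + logging-table pattern that
# SLT's master controller sets up in the source system. Object names
# (MATERIAL, ZLOG_MATERIAL, the trigger) are invented for this example.
import sqlite3  # stand-in database so the sketch is runnable anywhere

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE MATERIAL (MATNR TEXT PRIMARY KEY, DESCR TEXT);

    -- Logging table: records which keys changed, for later replication.
    CREATE TABLE ZLOG_MATERIAL (MATNR TEXT, OPERATION TEXT);

    -- Trigger writes a log entry whenever the source table changes.
    CREATE TRIGGER TRG_MATERIAL_INS AFTER INSERT ON MATERIAL
    BEGIN
        INSERT INTO ZLOG_MATERIAL VALUES (NEW.MATNR, 'I');
    END;
""")

conn.execute("INSERT INTO MATERIAL VALUES ('M-100', 'Pump')")
print(conn.execute("SELECT * FROM ZLOG_MATERIAL").fetchall())  # [('M-100', 'I')]
```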
18. What is data integration with SAP BW/4HANA?
Ans:
BryteFlow for SAP is a completely automated SAP data integration tool that extracts SAP data from silos and replicates it to a cloud data warehouse, data lake, or operational data store, including AWS and Azure, in real time.
19. Name the different components of SAP HANA.
Ans:
- SAP HANA DB
- SAP HANA Studio
- SAP HANA Appliance
- SAP HANA Application Cloud
20. Give examples of DataSources supporting this (inventory/non-cumulative scenarios).
Ans:
2LIS_03_BF and 2LIS_03_UM.
21. What is the relation between BW-IP and the Planning Applications Kit?
Ans:
The Planning Applications Kit (PAK) allows the planning functions of BW Integrated Planning (BW-IP) to be executed directly in SAP HANA instead of on the ABAP application server, so existing BW-IP models can benefit from in-memory processing.
22. What are the steps to load a non-cumulative cube?
Ans:
- Initialize the opening balance in R/3 (S278).
- Activate the extract structure MC03BF0 for DataSource 2LIS_03_BF.
- Set up historical material documents in R/3.
- Load the opening balance using DataSource 2LIS_40_S278.
- Load historical movements and compress without a marker update.
- Set up the V3 update.
- Load deltas using 2LIS_03_BF.
23. What are the types of attributes?
Ans:
Display-only and navigational. Display-only attributes are only for display, and no analysis can be done on them; navigational attributes behave like regular characteristics. For example, if you have a customer characteristic with country as a navigational attribute, you can analyze the data using both customer and country.
24. In SAP BW/BI, what are the main areas and activities?
Ans:
Data warehousing: integrating, collecting, and managing the entire company's data.
Analysis and planning: using the data stored in the data warehouse.
Reporting: BI provides tools for reporting in a web browser, Excel, etc.
Broadcast publishing: sending information to employees via email, fax, etc.
Performance: monitoring the performance of the company.
Security: securing access, for example using SAP logon tickets from the portal.
25. What is the time distribution option in an update rule?
Ans:
This option distributes data according to time. For example, if the source contains a calendar week and the target contains a calendar day, the data is split across each calendar day. Here you can select either the normal calendar or a factory calendar (a minimal sketch follows).
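A minimal Python sketch of the idea, assuming an even split over a plain 7-day calendar week (no factory calendar):

```python
# Minimal sketch of the time-distribution idea: a key figure posted on a
# calendar week is split evenly across the days of that week. Calendar
# handling is simplified (a plain 7-day week, no factory calendar).
from datetime import date, timedelta

def distribute_week_to_days(week_start: date, amount: float):
    """Split a weekly amount evenly over the 7 calendar days."""
    daily = amount / 7
    return {week_start + timedelta(days=i): daily for i in range(7)}

# Example: 700 units posted for the week starting 2022-09-19.
for day, value in distribute_week_to_days(date(2022, 9, 19), 700.0).items():
    print(day.isoformat(), value)
```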
26. What is agile data modeling with SAP BW/4HANA?
Ans:
Modeling refers to the activity of refining or slicing data in database tables by creating information views that depict a business scenario. You can use these information views for reporting and decision-making purposes.
28. What is the difference between ODS and InfoCubes?
Ans:
- An ODS has a key, while an InfoCube does not.
- An ODS contains detailed-level data, while an InfoCube contains refined data.
- An InfoCube follows a star schema (16 dimensions), while an ODS is a flat structure.
- There can be two or more ODS objects under a cube, so a cube can contain combined data or data derived from other fields in the ODS.
29. What is technically contained in the appliance?
Ans:
- SAP S/4HANA (the core ABAP backend, including the SAP HANA database).
- SAP NW JAVA with Adobe Document Services (for forms and output management).
- An SAP BusinessObjects BI platform.
- MS Windows remote desktop for easy access to the solution.
30. Mention the deployment options for BW/4HANA.
Ans:
SAP BW/4HANA can be deployed through managed cloud offerings such as SAP HANA Enterprise Cloud (HEC), Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure.
31. How can the appliance be used?
Ans:
- Within 1-2 hours in a hosted environment, using cloud provider infrastructure such as Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP).
- Within typically ~2-3 days on your own on-premise hardware that has a supported Linux release installed (everything else comes with the appliance).
32. What are the content and scope?
Ans:
The scope of the appliance centers around the SAP Best Practices content for SAP S/4HANA and is enhanced by further scenarios, depending on the exact release version of the appliance.
33. Which scenarios are included in the appliance?
Ans:
- Overview Pages
- Sell from stock / outbound delivery processing
- Accounting & Financial Close
- Lease-in accounting
- Investments
- Universal Allocation
- Bank account management
- Overdue receivables
- Group Reporting
34. What is Data Tiering Optimization with SAP BW/4HANA?
Ans:
SAP customers running larger HANA databases frequently ask Microsoft what technologies are supported on Azure to reduce the HANA memory footprint. The SAP HANA database, by design, creates a hard dependency on physical memory, while SAP applications, by design, create ever-increasing database sizes. These two competing design principles create challenges for large customers.
35. What are InfoObjects?
Ans:
Characteristics and key figures are called InfoObjects. InfoObjects are similar to fields of the source system; they are the basis on which data is organized in the various InfoProviders in BW.
36. What is data volume performance in BW/4HANA?
Ans:
Data and logs of a system are stored in volumes. SAP HANA persists in-memory data by using savepoints. Every SAP HANA service has its own separate savepoints. During a savepoint operation, the SAP HANA database flushes all changed data from memory to the data volumes (a hedged monitoring sketch follows).
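As a hedged illustration, recent savepoint activity can be inspected through HANA's M_SAVEPOINTS monitoring view; connection details below are placeholders, and the column list is left as * to avoid guessing column names.

```python
# Hedged sketch: sampling records from HANA's M_SAVEPOINTS monitoring
# view to observe savepoint activity. Host, port, and credentials are
# placeholders; the SELECT assumes monitoring privileges.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO_USER", password="secret")
cur = conn.cursor()

cur.execute("SELECT TOP 5 * FROM M_SAVEPOINTS")  # sample a few savepoint records
for row in cur.fetchall():
    print(row)

cur.close()
conn.close()
```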
37. What are SAP systems and non-SAP systems?
Ans:
SAP provides ERP software that integrates different business processes and functional areas, including manufacturing, sales, inventory control, finance, and human resources. Any computer system that has SAP software installed on it is called an SAP system; systems that do not run SAP software are non-SAP systems.
38. What is the BODS object hierarchy?
Ans:
Properties − They are used to describe an object and do not affect its operation.
Example − the name of the object, the date when it was created, etc.
Options − They control the operation of objects.
39. What are the BODS tools and functions?
Ans:
There are many tools in SAP BusinessObjects Data Services. Every tool has its own function depending on the system landscape.
40. What is modelling?
Ans:
Database design is done using modelling. The design of the database depends on the schema, and a schema is defined as the representation of tables and their relationships.
41. Which steps are performed on AWS?
Ans:
42. Which steps are performed on the on-premises SAP HANA system?
Ans:
- Cloud Block Store on AWS is connected to the on-premises FlashArray as an asynchronous-replication target.
- A protection group is created for the on-premises SAP HANA data volume. This protection group is configured to replicate to Cloud Block Store.
- An application-consistent snapshot of the on-premises SAP HANA system is then taken: put SAP HANA into snapshot (backup) mode, freeze the data volume's file system, take the snapshot, then unfreeze the file system and end SAP HANA backup mode.
43. What is the HANA SQL DW with Smart Data Integration?
Ans:
SAP HANA Smart Data Integration and SAP HANA Smart Data Quality load data, in batch or real time, into HANA from a variety of sources using pre-built and custom adapters. However, the Data Provisioning Agent needs to be hosted on an Intel machine. It is possible to connect the two.
44. When and why would you extract from ABAP to the SQL DW?
Ans:
The SAP HANA SQL DW provides a complete DW toolset in line with the SQL approach. SAP positions the SQL DW next to SAP BW/4HANA, allowing you to run them individually or together as one data warehouse. Where the SAP HANA SQL DW is a relatively new offering, SAP BW has a long and well-known history. That also goes for the extraction of data from ABAP-based SAP source systems. If you are running that scenario with SAP BW and see no need to change it, you are perfectly fine. The functions described here are for customers that have a use case for direct extraction of ABAP sources to their SAP HANA SQL DW, or other use cases where data needs to land in SAP HANA.
45. How did people load from ABAP to the SQL DW before?
Ans:
- Direct extraction from the source database tables: this can be the fastest data transfer option, but it bypasses the extractor logic and, since you directly access the source database, might violate database license restrictions.
- Using SAP Data Services, which supports the ODP protocol to load via the ABAP application layer.
- Smart Data Integration (SDI), which also supports the ODP protocol but is built straight into SAP HANA. This is where the functionality has been extended.
46. What is an extended star schema?
Ans:
A star schema consists of fact tables and dimension tables, while the tables that contain master data are kept separate. These separate tables for master data are what make it an extended star schema (an illustrative sketch follows).
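An illustrative sketch of the idea (not the tables BW actually generates): the fact table carries dimension IDs and key figures, the dimension table carries only surrogate keys (SIDs), and the master data sits in a separate attribute table. SQLite is used so the example is runnable; all names and data are invented.

```python
# Illustrative tables showing the extended star schema idea: fact table
# -> dimension table (SIDs only) -> separate master-data attribute table.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    -- Master data kept outside the dimension (the "extended" part).
    CREATE TABLE CUSTOMER_ATTR (CUSTOMER_SID INTEGER PRIMARY KEY,
                                COUNTRY TEXT, SEGMENT TEXT);

    -- Dimension table only holds surrogate keys (SIDs).
    CREATE TABLE DIM_CUSTOMER (DIM_ID INTEGER PRIMARY KEY,
                               CUSTOMER_SID INTEGER);

    -- Fact table holds DIM IDs plus the key figures.
    CREATE TABLE FACT_SALES (DIM_ID INTEGER, REVENUE REAL);
""")

conn.executemany("INSERT INTO CUSTOMER_ATTR VALUES (?,?,?)",
                 [(1, 'DE', 'Retail'), (2, 'US', 'Wholesale')])
conn.executemany("INSERT INTO DIM_CUSTOMER VALUES (?,?)", [(10, 1), (11, 2)])
conn.executemany("INSERT INTO FACT_SALES VALUES (?,?)", [(10, 500.0), (11, 900.0)])

# Revenue by country: fact -> dimension -> master data.
print(conn.execute("""
    SELECT a.COUNTRY, SUM(f.REVENUE)
    FROM FACT_SALES f
    JOIN DIM_CUSTOMER d ON d.DIM_ID = f.DIM_ID
    JOIN CUSTOMER_ATTR a ON a.CUSTOMER_SID = d.CUSTOMER_SID
    GROUP BY a.COUNTRY
""").fetchall())
```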
47. What is data warehousing?
Ans:
Data warehousing is a technique or system that collects transformed data from homogeneous and/or heterogeneous data sources and transfers it into a single data store. It provides data to analytical tools. Data warehouses are the central repositories of a business intelligence system.
48. What are the components of a data warehouse architecture?
Ans:
A typical data warehouse has four major components: a central database, ETL (extract, transform, load) tools, metadata, and access tools. All of these components are engineered for speed so that you can get results quickly and analyse data on the fly (a toy ETL sketch follows).
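A toy extract-transform-load flow, with invented data and names, just to make the component list concrete:

```python
# A toy extract-transform-load flow: source rows are extracted, cleaned
# and transformed, and loaded into the central database.
import sqlite3

source_rows = [("m-100", " pump ", "1200.50"), ("m-200", "valve", "80")]

def transform(row):
    """Clean up one extracted row: normalize codes, trim text, cast price."""
    material, descr, price = row
    return material.upper(), descr.strip().title(), float(price)

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE DIM_MATERIAL (MATNR TEXT, DESCR TEXT, PRICE REAL)")
warehouse.executemany("INSERT INTO DIM_MATERIAL VALUES (?,?,?)",
                      (transform(r) for r in source_rows))
print(warehouse.execute("SELECT * FROM DIM_MATERIAL").fetchall())
```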
49. What are the benefits of SAP HANA data warehousing?
Ans:
- Instantly access real-time or historical data from SAP or non-SAP data sources for real-time analysis and business insights, either on-premise or in the cloud.
- It offers high-volume, real-time data processing.
- Make the most of the tool as a big data warehouse; it allows full capitalization on the data.
- You can develop applications according to your requirements and use processed data from the data warehouse, conduct advanced analytics, and integrate with machine learning capabilities.
50. What is ‘Fact Table’?
Ans:
A fact table is a collection of facts and relations, that is, foreign keys to the dimensions. The fact table actually holds the transactional data.
51. What are the SAP HANA lines of business?
Ans:
- Automotive
- Engineering, Construction, and Operations
- Food production
- Mining
- Research and Development
- Chemicals
- Health care and hospitals
- Computer software and hardware
- Management Consulting
52. What is the future of SAP HANA?
Ans:
- Although it is a widely held notion that SAP HANA is costlier than other in-memory database tools available in the market, SAP HANA remains a popular database technology.
- SAP works meticulously on the development and evolution of the tool and releases new updates and software packages at regular intervals.
- New technologies have come to the market, such as Hadoop, Scala, Apache Spark, Hive, Pig, Python, and R, which provide fierce competition to SAP HANA in terms of being a cost-effective choice for users.
- Still, being the efficient technology that it is, SAP HANA ensures a well-paying and secure career if you keep upskilling in this technology and keep learning its newer and latest aspects.
53. What is a star schema?
Ans:
The star schema is the backbone of all data warehouse modelling, be it SAP or Oracle. It is a fairly simple concept and is really important for any kind of analysis.
54. What are the data types for a characteristic InfoObject?
Ans:
- CHAR
- NUMC
- DATS
- TIMS
55. What are the options post-migration?
Ans:
After migrating from one of the supported source platforms, you can perform a post-migration check to verify that the transferred websites, email accounts, databases, and so on are available on the destination server.
56. What are SAP and Azure Data Factory?
Ans:
Azure Data Factory enables you to ingest data from SAP Table and SAP Business Warehouse (BW) via Open Hub by using the Copy Activity. The SAP Table connector supports integrating SAP data in SAP ECC, SAP S/4HANA, and other products in the SAP Business Suite.
57. Which objects take part in this scenario?
Ans:
- SAP BW Composite Provider
- SAP BW External SAP HANA View
- Integration Runtime (ODBC)
- Azure Data Factory
- Azure BLOB
58. What are SAP and Azure Analysis Services?
Ans:
Azure Analysis Services (AAS) enables you to build a master semantic layer sitting on top of all the data sources. In essence, it acts as a single source of truth for all other BI models created by business users with Power BI or Power Apps. Providing self-service access to data and guided discovery, AAS empowers teams to distill insights from large datasets faster. This service also eliminates the need for extra data manipulation.
59. What is SAP and Power Apps integration?
Ans:
Power Apps is a handy low-code platform that has already won over both professional and citizen developers. With drag & drop front-end development functionality, pre-made integrations, and reusable code snippets, Power Apps significantly reduces development time and cost for business apps.
60. What is the use of a process chain?
Ans:
Process chains are used to automate the data load process. They automate steps such as data loads, index creation, deletion, cube compression, etc. Process chains are used mainly for loading data.
61. What is Azure Active Directory (AAD)?
Ans:
Azure Active Directory (Azure AD) is Microsoft's cloud-based identity and access management service, which helps employees sign in and access resources such as Microsoft 365, the Azure portal, and thousands of other SaaS applications.
62. What are the SAP and Azure Active Directory integration features?
Ans:
- SSO is based on the user's email address or another personal identifier. Users can log in to all Azure and SAP products without the need to create separate logins and passwords.
- A single point of control for all identity management tasks. Security, access provisioning, and de-provisioning can be performed in a centralized manner.
- Multi-factor authentication can be added as an extra security layer to protect the most sensitive applications and operations.
63. What is Microsoft Azure?
Ans:
Microsoft Azure (formerly known as Windows Azure) is a cloud computing platform launched by Microsoft. Cloud computing has gained immense popularity over time: through it, a large and wide user base gets access to computing resources such as servers, storage, dedicated networks, virtual machines, security, and more over the internet (the cloud).
64. What are the transaction codes (T-codes) for InfoCubes?
Ans:
LISTCUBE: list viewer for InfoCubes
LISTSCHEMA: show the InfoCube schema
RSDCUBE, RSDCUBED, RSDCUBEM: start InfoCube editing
65. What services are provided by Microsoft Azure?
Ans:
- Computing.
- Networking.
- Storage.
- Migration.
- Mobile.
- Analysis.
- Containers.
66. What are the SAP HANA use cases?
Ans:
- Application and IT infrastructure security.
- Asset operation and maintenance.
- Business planning and consolidation.
- Customer Relationship Management (CRM).
- Database and data warehouse management.
- Enterprise information and performance management.
- Human resource management.
- Logistics and order fulfilment.
67. Mention the cloud advantages.
Ans:
- Flexibility.
- Scalability.
- Hybrid integration with on-premises datacentres.
- Agile implementation of new solutions.
- Data integrity.
- Cost-effectiveness – charges based on usage.
68. Why Azure?
Ans:
- Microsoft spends over 1 billion dollars on cyber-attack research and cybersecurity improvements.
- Azure meets the requirements of over 90 compliance certifications across various sectors and regions.
- More than 95% of Fortune 500 companies use Microsoft cloud solutions.
69. Which three kinds of connectors does Azure Data Factory provide for connecting to SAP BW?
Ans:
- SAP Open Hub
- SAP MDX
- SAP Table
70. What is the maximum number of key figures and characteristics?
Ans:
The maximum number of key figures is 233 and the maximum number of characteristics is 248.
71. Mention the features available in ADF pipeline activities.
Ans:
A pipeline is a logical grouping of activities that together perform a unit of work, and those processes can be run in a scaled-out manner from ADF pipelines.
72. What are the SAP BW Open Hub limitations?
Ans:
- Since Open Hub does not support InfoSets, data cannot be brought directly from an InfoSet. This can be worked around by using another BW object, such as a DSO or cube, between the InfoSet and the Open Hub to stage the data.
- The output data is flat, not multidimensional, and the data type of the output cannot be changed.
- Delta loading from a BEx query to Open Hub is not possible. So, if delta is necessary, it must be handled in the design of the BEx query.
73. What is the role of SQL in SAP HANA?
Ans:
Most RDBMS databases use SQL as the database language; the reason for its popularity is that it is powerful, vendor-independent, and standardized. SAP HANA also supports SQL; in fact, SQL is the main database language in SAP HANA (a minimal connection sketch follows).
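A minimal sketch of issuing standard SQL against HANA from Python with the hdbcli driver; host, port, and credentials are placeholders. DUMMY is HANA's one-row system table.

```python
# Minimal sketch of running standard SQL against SAP HANA via hdbcli.
# Host, port, and credentials are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO_USER", password="secret")
cur = conn.cursor()

# DUMMY is HANA's one-row system table, handy for scalar queries.
cur.execute("SELECT CURRENT_TIMESTAMP FROM DUMMY")
print(cur.fetchone())

cur.close()
conn.close()
```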
74. Mention the development models integrating HANA models and BW.
Ans:
There are four types of InfoObjects in SAP BW/4HANA: characteristic, key figure, unit, and XXL.
75. How can you convert an InfoPackage group into a process chain?
Ans:
You can convert an InfoPackage group into a process chain by double-clicking on the InfoPackage group, then clicking the 'Process Chain Maint' button, where you enter a name and description; this inserts the individual InfoPackages automatically.
76. How does Azure Data Factory differ from other ETL tools?
Ans:
ADF, which resembles SSIS in many aspects, is mainly used for ETL, data movement, and orchestration, whereas Databricks can be used for real-time data streaming and collaboration across data engineers and data scientists, along with supporting the design and development of AI and machine learning models.
77. Can an InfoObject be an InfoProvider?
Ans:
Yes, an InfoObject can be an InfoProvider. In order to do this, you have to right-click on the InfoArea and select "Insert characteristic as data target".
78. What is the ADF Integration Runtime?
Ans:
The Integration Runtime (IR) is the compute infrastructure used by Azure Data Factory and Azure Synapse pipelines to offer data integration capabilities across various network environments, for example Data Flow: executing data flows in a managed Azure compute environment.
79. What is Change Data Capture (CDC)?
Ans:
The technology emerged two decades ago to help replication software vendors deliver real-time transactions to data warehouses. It works through standard change-log mechanisms to continuously identify and capture incremental changes to data and data schemas from sources such as enterprise-grade databases. These small incremental changes are then automatically propagated to target systems through low-latency data transfer. The impact on operational systems is extremely low, since the source systems don't require integration agents or additional database queries. As a result, CDC has the benefit of being easier to administer and manage than other types of data integration. Furthermore, there is no "batch window", and data in the target systems is always up to date (a conceptual sketch follows).
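A purely conceptual sketch of the apply side of CDC, with an invented change-log format: incremental insert/update/delete records are propagated to the target instead of re-copying the whole table.

```python
# Conceptual sketch of change data capture: incremental changes recorded
# in a change log are applied to the target, instead of re-copying the
# full table. The log format and data here are invented for illustration.
target = {"M-100": {"descr": "Pump", "price": 1200.0}}

change_log = [
    {"op": "U", "key": "M-100", "data": {"price": 1150.0}},                  # update
    {"op": "I", "key": "M-200", "data": {"descr": "Valve", "price": 80.0}},  # insert
    {"op": "D", "key": "M-100", "data": None},                               # delete
]

for change in change_log:
    if change["op"] == "I":
        target[change["key"]] = change["data"]
    elif change["op"] == "U":
        target[change["key"]].update(change["data"])
    elif change["op"] == "D":
        target.pop(change["key"], None)

print(target)  # {'M-200': {'descr': 'Valve', 'price': 80.0}}
```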
80. How do you replicate data from on-premises to Azure?
Ans:
Replication can be configured by using SQL Server Management Studio or by executing Transact-SQL statements on the publisher. You cannot configure replication by using the Azure portal. Replication can only use SQL Server authentication logins to connect to Azure SQL Database, and replicated tables must have a primary key.
81. How do Azure Data Factory and Attunity Replicate compare?
Ans:
82. What is automated SAP HANA System Replication?
Ans:
SAP HANA System Replication is implemented between two different SAP HANA systems with the same number of active nodes.
83. What is a Conversion Routine?
Ans:
A conversion routine is used to convert data from an internal format to an external (display) format.
84. What are the options for running SAP HANA on Microsoft Azure?
Ans:
- You can deploy an SAP HANA instance on Azure Virtual Machines, which go up to GS-5 virtual machines (GS5 VM) with 448 GB of memory.
- The other way is to deploy SAP HANA instances on Azure Large Instances for SAP HANA: physical servers offered in Azure data center locations according to SAP HANA specifications.
85. Explain the difference between a start routine and a conversion routine.
Ans:
In a start routine, you can modify the data packages while data is being loaded. A conversion routine, on the other hand, refers to routines bound to InfoObjects for conversion between internal and display formats.
86. Which OS is managed by both the customer and Microsoft?
Ans:
Microsoft Windows, also called Windows or Windows OS, is a computer operating system (OS) developed by Microsoft Corporation to run personal computers (PCs).
87. How do you run an SAP HANA non-production environment on Azure?
Ans:
- In the reference architecture, the first segment on the left refers to the customer datacenter with a HANA production instance deployed.
- On the bottom right, there are a number of virtual machines labelled "ECC Applications for QA" and "ECC Applications for DEV/TEST". These are application server layers that can be deployed on Microsoft Azure Virtual Machines.
88. How do you unlock objects in the Transport Organizer?
Ans:
To unlock objects in the Transport Organizer, go to SE03 → Request/Task → Unlock Objects. Enter the request, select unlock, and execute; this unlocks the request.
89. Which activities can be performed with SDA?
Ans:
Here are some sample activities that can be performed with HANA SDA: consume data mart scenarios of SAP HANA from the connected database; consolidate the data warehouse landscape; create another data warehouse that is completely transparent to the SAP BW/4HANA database of the BW system.
90. What are common use cases for SDA?
Ans:
The following are some common use cases for SDA: the main use case for SAP HANA SDA is archiving data in various sources while retaining active data in SAP HANA. It is possible to use SAP IQ as archive storage in order to keep real-time hot data in HANA and cold data in IQ. You can create applications that run SAP HANA models across various data sources, and overcome Big Data challenges by linking to Hadoop through the Hive interface (a hedged SQL sketch follows).
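A hedged sketch of the SDA pattern of querying hot data in HANA together with archived data in a remote source through a virtual table. The remote source name, the object path, and the local SALES_FACT table are assumptions, and the exact CREATE VIRTUAL TABLE path format depends on the adapter, so treat the SQL as illustrative only.

```python
# Hedged sketch: a virtual table in HANA points at a table in a connected
# remote database, so hot data in HANA and archived (cold) data can be
# queried together. "IQ_ARCHIVE", the object path, and SALES_FACT are
# invented; the remote source is assumed to already exist, and the exact
# path components ("<NULL>", schema, table) depend on the adapter.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO_USER", password="secret")
cur = conn.cursor()

# Virtual table proxying the archived sales table in the remote system.
cur.execute("""
    CREATE VIRTUAL TABLE SALES_ARCHIVE
    AT "IQ_ARCHIVE"."<NULL>"."DWH"."SALES_HISTORY"
""")

# One query over hot (local SALES_FACT) and cold (remote) data.
cur.execute("""
    SELECT * FROM SALES_FACT
    UNION ALL
    SELECT * FROM SALES_ARCHIVE
""")
print(cur.fetchmany(10))

cur.close()
conn.close()
```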
91. Which language is SAP HANA developed in?
Ans:
The SAP HANA database is developed in C++.
92. What is ad hoc analysis?
Ans:
In traditional data warehouses, such as SAP BW, a lot of pre-aggregation is done for fast results. That is, the administrator (IT department) decides which information might be needed for analysis and prepares the results for the end users. This yields good performance, but the end user has no flexibility. Ad hoc analysis, by contrast, lets users analyze detailed data on the fly without such prepared aggregates.
93. What are the row-based and column-based approaches?
Ans:
Row-based tables:
- This is the traditional relational database approach.
- Tables are stored as a sequence of rows.
Column-based tables:
- Tables are stored as a sequence of columns, i.e. the entries of a column are stored in contiguous memory locations.
- SAP HANA is particularly optimized for column-order storage.
- SAP HANA supports both the row-based and column-based approaches (see the sketch after this list).
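A minimal sketch contrasting the two table types in SAP HANA SQL; the connection details are placeholders and the table names are invented.

```python
# Minimal sketch: creating a row-store and a column-store table in HANA.
# Host, port, and credentials are placeholders.
from hdbcli import dbapi

conn = dbapi.connect(address="hana-host", port=30015,
                     user="DEMO_USER", password="secret")
cur = conn.cursor()

# Row store: rows are kept contiguously, suited to OLTP-style access.
cur.execute("CREATE ROW TABLE ORDERS_ROW (ID INTEGER, STATUS NVARCHAR(10))")

# Column store (HANA's preferred type): each column is stored contiguously,
# which compresses well and scans fast for analytical workloads.
cur.execute("CREATE COLUMN TABLE ORDERS_COL (ID INTEGER, STATUS NVARCHAR(10))")

cur.close()
conn.close()
```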
94. Can you partition a cube that already has data?
Ans:
No; the cube must be empty to do this. One workaround is to make a copy of cube A as cube B, export the data from A to B using an export DataSource, empty cube A, create the partitions on A, re-import the data from B, and then delete cube B.
95. What is a DIM ID?
Ans:
DIM IDs are used to connect the fact tables and the dimension tables.
96. What is an InfoSource?
Ans:
A structure consisting of InfoObjects, without persistence, for connecting two transformations.
97. What is an update or transfer routine?
Ans:
The update routine is used to define global data and global checks. They are defined at the object level and are similar to a start routine.
98. What is a transactional InfoCube?
Ans:
These cubes are used for both reading and writing, whereas standard cubes are optimized for reading. Transactional cubes are used in SEM.
99. What are the transaction codes for process chains?
Ans:
RSPC: process chain maintenance
RSPC1: process chain display
RSPCM: monitor daily process chains
RZ20: view logs for process chains
100. Can you disable the cache?
Ans:
Yes, either globally or per query using the query debug tool RSRT.