You need to create the usp_AssignUser stored procedure
DRAG DROP
###BeginCaseStudy###
Case Study: 7
Fourth Coffee
Background
Corporate Information
Fourth Coffee is a global restaurant chain. There are more than 5,000 locations worldwide.
Physical Locations
Currently a server at each location hosts a SQL Server 2012 instance. Each instance contains
a database named StoreTransactions that stores all point-of-sale transactions and uploads
summary batches nightly.
Each server belongs to the COFFECORP domain. Local computer accounts access the
StoreTransactions database at each store by using the sysadmin and db_datareader/db_datawriter roles.
Planned changes
Fourth Coffee has three major initiatives:
• The IT department must consolidate the point-of-sale database infrastructure.
• The marketing department plans to launch a mobile application for micropayments.
• The finance department wants to deploy an internal tool that will help detect fraud.
Initially, the mobile application will allow customers to make micropayments to buy coffee
and other items on the company web site. These micropayments may be sent as gifts to other
users and redeemed within an hour of ownership transfer. Later versions will generate
profiles based on customer activity that will push texts and ads generated by an analytics
application.
When the consolidation is finished and the mobile application is in production, the
micropayments and point of sale transactions will use the same database.
Existing Environment
Existing Application Environment
Some stores have been using several pilot versions of the micropayment application. Each
version currently is in a database that is independent from the point of sales systems. Some
versions have been used in field tests at local stores, and others are hosted at corporate
servers. All pilot versions were developed by using SQL Server 2012.
Existing Support Infrastructure
The proposed database for consolidating micropayments and transactions is called
CoffeeTransactions. The database is hosted on a SQL Server 2014 Enterprise Edition
instance and has the following file structures:
Business Requirements
General Application Solution Requirements
The database infrastructure must support a phased global rollout of the micropayment
application and consolidation.
The consolidated micropayment and point-of-sale data will be merged into the CoffeeTransactions
database. The infrastructure also will include a new CoffeeAnalytics database for reporting
on content from CoffeeTransactions.
Mobile applications will interact most frequently with the micropayment database for the
following activities:
• Retrieving the current status of a micropayment;
• Modifying the status of the current micropayment; and
• Canceling the micropayment.
The mobile application will need to meet the following requirements:
• Communicate with web services that assign a new user to a micropayment by using a
stored procedure named usp_AssignUser.
• Update the location of the user by using a stored procedure named
usp_AddMobileLocation.
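The case study's actual table definitions appear only in images that are not reproduced here, so as a rough sketch only, a usp_AssignUser procedure satisfying these requirements might take the following shape. The dbo.Micropayment table and its MicropaymentId and UserId columns are assumptions, not the exam's schema:

```sql
-- Hypothetical sketch; dbo.Micropayment and its columns are assumptions.
CREATE PROCEDURE usp_AssignUser
    @MicropaymentId INT,
    @UserId INT
AS
BEGIN
    SET NOCOUNT ON;
    BEGIN TRY
        -- Assign the new user to the micropayment on behalf of the web service.
        UPDATE dbo.Micropayment
        SET UserId = @UserId
        WHERE MicropaymentId = @MicropaymentId;
    END TRY
    BEGIN CATCH
        -- Surface the error as text to the calling web service,
        -- per the maintainability requirement below.
        SELECT ERROR_MESSAGE() AS ErrorMessage;
    END CATCH
END
```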
The fraud detection service will need to meet the following requirements:
• Query the current open micropayments for users who own multiple micropayments by
using a stored procedure named usp_LookupConcurrentUsers.
• Persist the current user locations by using a stored procedure named
usp_MobileLocationSnapshot.
• Look at the status of micropayments and mark micropayments for internal
investigations.
• Move micropayments to the dbo.POSException table by using a stored procedure named
usp_DetectSuspiciousActivity.
• Detect micropayments that are flagged with a StatusId value that is greater than 3 and
that occurred within the last minute.
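The last detection requirement maps naturally to a filtered query. As a minimal sketch (the table and column names here are assumptions; the case study does not reproduce the schema), the fraud detection service could find recently flagged micropayments like this:

```sql
-- Hypothetical sketch; dbo.Micropayment and its columns are assumptions.
-- Find micropayments flagged with StatusId > 3 in the last minute.
SELECT MicropaymentId, UserId, StatusId, TransactionTime
FROM dbo.Micropayment
WHERE StatusId > 3
  AND TransactionTime >= DATEADD(MINUTE, -1, SYSDATETIME());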
The CoffeeAnalytics database will combine imports of the POSTransaction and
MobileLocation tables to create a UserActivity table for reports on the trends in activity.
Queries against the UserActivity table will include aggregated calculations on all columns
that are not used in filters or groupings.
Micropayments need to be updated and queried for only a week after their creation by the
mobile application or fraud detection services.
Performance
The most critical performance requirement is keeping the response time for any queries of the
POSTransaction table predictable and fast.
Web service queries will take a higher priority in performance tuning decisions over the fraud
detection agent queries.
Scalability
Queries of the user of a micropayment must not return results while the micropayment is
being updated, but they can show different users during different stages of the transaction.
The fraud detection service frequently will run queries over the micropayments that occur
over different time periods that range between 30 seconds and ten minutes.
The POSTransaction table must have its structure optimized for hundreds of thousands of
active micropayments that are updated frequently.
All changes to the POSTransaction table will require testing in order to confirm the expected
throughput that will support the first year’s performance requirements.
Updates of a user’s location can tolerate some data loss.
Initial testing has determined that the POSTransaction and POSException tables will be
migrated to in-memory optimized tables.
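For context, a memory-optimized table in SQL Server 2014 (In-Memory OLTP) is declared with MEMORY_OPTIMIZED = ON and a hash index sized for the expected row count. This is a hedged sketch only; the POSTransaction column names are assumptions, not the case study's actual definition:

```sql
-- Hypothetical sketch of a memory-optimized POSTransaction table;
-- column names and the bucket count are assumptions.
CREATE TABLE dbo.POSTransaction
(
    TransactionId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1048576),
    StatusId INT NOT NULL,
    Amount MONEY NOT NULL,
    TransactionTime DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```

The bucket count is typically set to one to two times the expected number of active rows, which fits the stated scale of hundreds of thousands of active micropayments.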
Availability
In order to minimize disruption at local stores during consolidation, nightly processes will
restore the databases to a staging server at corporate headquarters.
Technical Requirements
Security
The sensitive nature of financial transactions in the store databases requires certification of
the COFFECORP\Auditors group at corporate, which will perform audits of the data. Members
of the COFFECORP\Auditors group cannot have sysadmin or db_datawriter access to the
database.
Compliance requires that the data stewards have access to any restored StoreTransactions
database without changing any security settings at a database level.
Nightly batch processes are run by the service accounts in the COFFECORP\StoreAgent
group and need to be able to restore the store databases and verify that their schemas match.
No Windows group should have more access to store databases than is necessary.
Maintainability
You need to anticipate when the POSTransaction table will need index maintenance.
When the daily maintenance finishes, micropayments that are one week old must be available
for queries in the UserActivity table. Micropayments will be queried most frequently within
their first week and will require support for in-memory queries for data within the first week.
The maintenance of the UserActivity table must allow frequent maintenance on the day’s
most recent activities with minimal impact on the use of disk space and the resources
available to queries. The processes that add data to the UserActivity table must be able to
update data from any time period, even while maintenance is running.
The index maintenance strategy for the UserActivity table must provide the optimal structure
for both maintainability and query performance.
All micropayment queries must use the most permissive isolation level available for
maximum throughput.
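For disk-based tables, the most permissive isolation level is READ UNCOMMITTED. A minimal sketch of how a query would opt into it (the table name is an assumption; memory-optimized tables do not support this level and would use their own isolation semantics instead):

```sql
-- READ UNCOMMITTED is the most permissive isolation level for
-- disk-based tables; dbo.Micropayment is an assumed name.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
SELECT MicropaymentId, StatusId
FROM dbo.Micropayment;
```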
In the event of unexpected results, all stored procedures must provide error messages as
text messages to the calling web service.
Any modifications to stored procedures will require the minimal amount of schema changes
necessary to increase the performance.
Performance
Stress testing of the mobile application on the proposed CoffeeTransactions database
uncovered performance bottlenecks. The sys.dm_os_wait_stats Dynamic Management View
(DMV) shows high wait_time values for the WRITELOG and PAGEIOLATCH_UP wait types
when updating the MobileLocation table.
Updates to the MobileLocation table must have minimal impact on physical resources.
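The wait types above can be inspected directly from the DMV the case study names. This query is a straightforward illustration of that diagnostic step:

```sql
-- Inspect accumulated wait statistics for the two wait types
-- observed during stress testing.
SELECT wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
WHERE wait_type IN ('WRITELOG', 'PAGEIOLATCH_UP')
ORDER BY wait_time_ms DESC;
```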
Supporting Infrastructure
The stored procedure usp_LookupConcurrentUsers has the current implementation:
The current stored procedure for persisting a user location is defined in the following code:
The current stored procedure for managing micropayments needing investigation is defined
in the following code:
The current table, before implementing any performance enhancements, is defined as follows:
###EndCaseStudy###
You need to create the usp_AssignUser stored procedure.
Develop the solution by selecting and arranging the required code blocks in the correct
order. You may not need all of the code blocks.
You need to ensure that the corporate logos appear in reports
###BeginCaseStudy###
Case Study: 5
Contoso, Ltd Case B
General Background
You are the business intelligence (BI) solutions architect for Contoso, Ltd, an online retailer.
You produce solutions by using SQL Server 2012 Business Intelligence edition and
Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition.
A SharePoint farm has been installed and configured for intranet access only. An Internet-facing
web server hosts the company’s public e-commerce website. Anonymous access is not
configured on the Internet-facing web server.
Data Warehouse
The data warehouse is deployed on a SQL Server 2012 relational database instance. The data
warehouse is structured as shown in the following diagram.
The following Transact-SQL (T-SQL) script is used to create the FactSales and FactPopulation tables:
The FactPopulation table is loaded each year with data from a Windows Azure Marketplace
commercial dataset. The table contains a snapshot of the population values for all countries of
the world for each year. The world population for the last year loaded exceeds 6.8 billion people.
ETL Process
SQL Server Integration Services (SSIS) is used to load data into the data warehouse. All SSIS
projects are developed by using the project deployment model.
A package named StageFactSales loads data into a data warehouse staging table. The package
sources its data from numerous CSV files exported from a mainframe system. The CSV file
names begin with the letters GLSD followed by a unique numeric identifier that never
exceeds six digits. The data content of each CSV file is identically formatted.
A package named LoadFactFreightCosts sources data from a Windows Azure SQL Database
database that has data integrity problems. The package may retrieve duplicate rows from the database.
The package variables of all packages have the RaiseChangedEvent property set to true. A
package-level event handler for the OnVariableValueChanged event consists of an Execute
SQL task that logs the System::VariableName and System::VariableValue variables.
Data Models
SQL Server Analysis Services (SSAS) is used to host the Corporate BI multidimensional
database. The Corporate BI database contains a single data source view named Data
Warehouse. The Data Warehouse data source view consists of all data warehouse tables. All
data source view tables have been converted to named queries.
The Corporate BI database contains a single cube named Sales Analysis and three database
dimensions: Date, Customer and Product. The dimension usage for the Sales Analysis cube is
as shown in the following image.
The Customer dimension contains a single multi-level hierarchy named Geography. The
structure of the Geography hierarchy is shown in the following image.
The Sales Analysis cube’s calculation script defines one calculated measure named Sales Per Capita.
The calculated measure expression divides the Revenue measure by the Population
measure and multiplies the result by 1,000. This calculation represents revenue per 1,000
people.
The Sales Analysis cube produces correct Sales Per Capita results for each country of the
world; however, the Grand Total for all countries is incorrect, as shown in the following
image (rows 2-239 have been hidden).
A role named Analysts grants Read permission for the Sales Analysis cube to all sales and
marketing analysts in the company.
SQL Server Reporting Services (SSRS) is configured in SharePoint integrated mode. All
reports are based on shared data sources.
Corporate logo images used in reports were originally configured as data-bound images
sourced from a SQL Server relational database table. The image data has been exported to
JPG files. The image files are hosted on the Internet-facing web server. All reports have been
modified to reference the corporate logo images by using the fully qualified URLs of the
image files. A red X currently appears in place of the corporate logo in reports.
Users configure data alerts on certain reports. Users can view a report named Sales
Profitability on demand; however, notification email messages are no longer being sent when
Sales Profitability report data satisfies alert definition rules. The alert schedule settings for the
Sales Profitability report are configured as shown in the following image.
Business Requirements
Data Models
Users must be able to:
• Provide context to measures and filter measures by using all related data warehouse
dimensions.
• Analyze measures by order date or ship date.
Additionally, users must be able to add a measure named Sales to the report canvas by
clicking only once in the Power View field list. The Sales measure must allow users to
analyze the sum of the values in the Revenue column of the FactSales data warehouse table.
Users must be able to change the aggregation function of the Sales measure.
Analysis and Reporting
A sales manager has requested the following query results from the Sales Analysis cube for
the 2012 fiscal year:
• Australian postal codes and sales in descending order of sales.
• Australian states and the ratio of sales achieved by the 10 highest customer sales made
for each city in that state.
Technical Requirements
ETL Processes
If an SSIS package variable value changes, the package must log the variable name and the
new variable value to a custom log table.
The StageFactSales package must load the contents of all files that match the file name
pattern. The source file name must also be stored in a column of the data warehouse staging
table.
In the design of the LoadFactSales package, if a lookup of the dimension surrogate key value
for the product code fails, the row details must be emailed to the data steward and written as
an error message to the SSIS catalog log by using the public API.
You must configure the LoadFactFreightCosts package to remove duplicate rows, by using
the least development effort.
Data Models
Users of the Sales Analysis cube frequently filter on the current month’s data. You must
ensure that queries to the Sales Analysis cube default to the current month in the Order Date
dimension for all users.
You must develop and deploy a tabular project for the exclusive use as a Power View
reporting data source. The model must be based on the data warehouse. Model table names
must exclude the Dim or Fact prefixes. All measures in the model must format values to
display zero decimal places.
Analysis and Reporting
Reports must be developed that combine the SSIS catalog log messages with the package
variable value changes.
###EndCaseStudy###
You need to ensure that the corporate logos appear in reports.
What should you do?
Which components should you use?
DRAG DROP
You need to develop the LoadFactSales package to write the error messages to the SSIS catalog log.
Which components should you use?
To answer, drag the appropriate components to the correct location or locations in the
answer area. (Use only components that apply.)
You need to configure per-user security authentication for reporting against the Sales cube
###BeginCaseStudy###
Case Study: 2
Contoso, Ltd
Background
You are the business intelligence (BI) solutions architect for Contoso Ltd, a multinational
sales company with offices in London, Madrid, Paris, Brisbane, Tokyo, and New York.
Contoso sells office consumable products such as pens, printer ink, and paper.
You produce solutions by using SQL Server 2012 Business Intelligence Edition and
Microsoft SharePoint Server 2010 Enterprise Edition with SP1.
Technical Background
Contoso’s products are categorized by using up to four levels, although some use only two or
three levels. Products are categorized as shown in the following table.
Contoso sells products through mobile sales staff, direct marketing, and its website. Sales
personnel are located in various regions around the world, and each region has a sales
manager who is paid a quarterly bonus based on the total sales in the region during the
quarter. Regions are categorized as shown in the following table.
SQL Server Analysis Services (SSAS) is used to host a multidimensional database. The
database contains a single cube named Sales and three database dimensions named Products,
Regions, and Date. A single measure named Sales Total has been defined in the cube. The
data source for the database is a SQL Server data warehouse.
The Products dimension contains a single user-defined hierarchy named Products. To prevent
the display of empty members when users browse the Products dimension, the Extract,
Transform, and Load (ETL) process populates all missing values as shown in the following diagram.
The structure of the Products hierarchy is shown in the following diagram.
The Regions dimension contains a single user-defined hierarchy named Sales Regions. The
dimension is based on a single dimension table in the data warehouse and the attribute
relationships have not been modified since the dimension was created by using the
Dimension wizard. The structure of the Sales Regions hierarchy is shown in the following diagram.
The Date dimension contains a single user-defined hierarchy named Calendar. The structure
of the Calendar hierarchy is shown in the following diagram.
A role named UserRegions has been created in the SSAS database that will be used to filter
members in the Regions dimension based on the authenticated user.
Administrative staff from around the world will produce sales reports with Microsoft Excel
2010 based on the Sales cube.
Developers will produce reports with SQL Server Reporting Services (SSRS) based on the
Sales cube and the reports will be delivered to users through report subscriptions and a web
browser. All users log on to an Active Directory Domain Services (AD DS) domain named
contoso.com.
All client computers and servers are joined to the contoso.com domain.
Business Requirements
The BI system must meet the following reporting requirements:
• Display all sales figures in euro currency, regardless of the client’s reporting location
• Include a new measure named AD Sales that calculates average daily sales for a
selected month
• Support near real-time reporting while maintaining good performance for
multidimensional queries
• Support reports that show currency exchange rates
• Deliver executive reports that are parameterized and rendered from report snapshots
In addition, cube objects must use terms familiar to users from around the world. For
example, in the Sales Regions hierarchy, users from Great Britain must see the State level
presented as County when browsing the Sales cube.
The Sales cube must support a new measure group named Sales Planning. The measure group
must consist of a single measure named Sales Plan that enables the management team to use
Excel 2010 to enter sales plans for future monitoring.
The BI system must meet the following technical requirements:
Architecture requirements
o The system must use separate servers for each of the following components:
SQL Server Database Engine
SQL Server Integration Services
SQL Server Analysis Services in multidimensional mode
SharePoint Server with the Reporting Services Add-in
o All servers must be installed using U.S. regional settings.
o The system must source currency exchange rate data from a database hosted in
Microsoft Azure SQL Database.
Security requirements
o When possible, the system must use Windows authentication for all database
connections.
o The system must prevent users from querying data from outside of their region.
o The system must allow certain users to query data from multiple regions.
Development requirements
o When browsing the Products hierarchy, repeating values for different levels of a given
drill-path must be avoided. For example, Papers -> Copy Paper -> Copy Paper ->
Copy Paper should appear simply as Papers -> Copy Paper.
o The system must support report snapshots. The default maximum number of retained
snapshots must not exceed five.
###EndCaseStudy###
You need to configure per-user security authentication for reporting against the Sales cube.
What should you do? (Each correct answer presents part of the complete solution. Choose
all that apply.)
You need to modify the stored procedure usp_LookupConcurrentUsers
###BeginCaseStudy###
Case Study: 7
Fourth Coffee
Background
Corporate Information
Fourth Coffee is global restaurant chain. There are more than 5,000 locations worldwide.
Physical Locations
Currently a server at each location hosts a SQL Server 2012 instance. Each instance contains
a database called StoreTransactions that stores all transactions from point of sale and uploads
summary batches nightly.
Each server belongs to the COFFECORP domain. Local computer accounts access the
StoreTransactions database at each store using sysadmin and datareaderwriter roles.
Planned changes
Fourth Coffee has three major initiatives:
• The FT department must consolidate the point of sales database infrastructure.
• The marketing department plans to launch a mobile application for micropayments.
• The finance department wants to deploy an internal tool that will help detect fraud.
Initially, the mobile application will allow customers to make micropayments to buy coffee
and other items on the company web site. These micropayments may be sent as gifts to other
users and redeemed within an hour of ownership transfer. Later versions will generate
profiles based on customer activity that will push texts and ads generated by an analytics
application.
When the consolidation is finished and the mobile application is in production, the
micropayments and point of sale transactions will use the same database.
Existing Environment
Existing Application Environment
Some stores have been using several pilot versions of the micropayment application. Each
version currently is in a database that is independent from the point of sales systems. Some
versions have been used in field tests at local stores, and others are hosted at corporate
servers. All pilot versions were developed by using SQL Server 2012.
Existing Support Infrastructure
The proposed database for consolidating micropayments and transactions is called
CoffeeTransactions. The database is hosted on a SQL Server 2014 Enterprise Edition
instance and has the following file structures:
Business Requirements
General Application Solution Requirements
The database infrastructure must support a phased global rollout of the micropayment
application and consolidation.
The consolidated micropayment and point of sales database will be into a CoffeeTransactions
database. The infrastructure also will include a new CoffeeAnalytics database for reporting
on content from CoffeeTransactions.
Mobile applications will interact most frequently with the micropayment database for the
following activities:
• Retrieving the current status of a micropayment;
• Modifying the status of the current micropayment; and
• Canceling the micropayment.
The mobile application will need to meet the following requirements:
• Communicate with web services that assign a new user to a micropayment by using a
stored procedure named usp_AssignUser.
• Update the location of the user by using a stored procedure named
usp_AddMobileLocation.
The fraud detection service will need to meet the following requirements:
• Query the current open micropayments for users who own multiple micropayments by
using a stored procedure named usp.LookupConcurrentUsers.
• Persist the current user locations by using a stored procedure named
usp_Mobilel_ocationSnapshot.
• Look at the status of micropayments and mark micropayments for internal
investigations.
• Move micropayments to dbo.POSException table by using a stored procedure named
ups_DetectSuspiciousActivity.
• Detect micropayments that are flagged with a StatusId value that is greater than 3 and
that occurred within the last minute.
The CoffeeAnalytics database will combine imports of the POSTransaction and
MobileLocation tables to create a UserActivity table for reports on the trends in activity.
Queries against the UserActivity table will include aggregated calculations on all columns
that are not used in filters or groupings.
Micropayments need to be updated and queried for only a week after their creation by the
mobile application or fraud detection services.
Performance
The most critical performance requirement is keeping the response time for any queries of the
POSTransaction table predictable and fast.
Web service queries will take a higher priority in performance tuning decisions over the fraud
detection agent queries.
Scalability
Queries of the user of a micropayment cannot return while the micropayment is being
updated, but can show different users during different stages of the transaction.
The fraud detection service frequently will run queries over the micropayments that occur
over different time periods that range between 30 seconds and ten minutes.
The POSTransaction table must have its structure optimized for hundreds of thousands of
active micropayments that are updated frequently.
All changes to the POSTransaction table will require testing in order to confirm the expected
throughput that will support the first year’s performance requirements.
Updates of a user’s location can tolerate some data loss.
Initial testing has determined that the POSTransaction and POSException tables will be
migrated to in-memory optimized tables.
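A memory-optimized migration of POSTransaction might look like the sketch below. The column list and bucket count are illustrative assumptions; only the table name and StatusId column appear in the case study.

```sql
-- Sketch only: columns other than StatusId, and the BUCKET_COUNT,
-- are assumptions. SCHEMA_AND_DATA keeps the transactions durable.
CREATE TABLE dbo.POSTransaction
(
    TransactionId INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 1000000),
    UserId        INT       NOT NULL,
    StatusId      TINYINT   NOT NULL,
    Amount        MONEY     NOT NULL,
    ModifiedDate  DATETIME2 NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_AND_DATA);
```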
Availability
In order to minimize disruption at local stores during consolidation, nightly processes will
restore the databases to a staging server at corporate headquarters.
Technical Requirements
Security
The sensitive nature of financial transactions in the store databases requires certification of
the COFFECORP\Auditors group at corporate that will perform audits of the data. Members
of the COFFECORP\Auditors group cannot have sysadmin or datawriter access to the
database.
Compliance requires that the data stewards have access to any restored StoreTransactions
database without changing any security settings at a database level.
Nightly batch processes are run by the services account in the COFFECORP\StoreAgent
group and need to be able to restore the store databases and verify that their schemas match.
No Windows group should have more access to store databases than is necessary.
Maintainability
You need to anticipate when the POSTransaction table will need index maintenance.
When the daily maintenance finishes, micropayments that are one week old must be available
for queries in the UserActivity table. Micropayments will be queried most frequently within
their first week and will require support for in-memory queries for data within the first week.
The maintenance of the UserActivity table must allow frequent maintenance on the day’s
most recent activities with minimal impact on the use of disk space and the resources
available to queries. The processes that add data to the UserActivity table must be able to
update data from any time period, even while maintenance is running.
The index maintenance strategy for the UserActivity table must provide the optimal structure
for both maintainability and query performance.
All micropayments queries must include the most permissive isolation level available for the
maximum throughput.
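The most permissive isolation level on disk-based tables is READ UNCOMMITTED. A sketch of how a micropayment lookup might request it; the query shape and the "open" StatusId value are assumptions (note that memory-optimized tables do not support READ UNCOMMITTED and use optimistic SNAPSHOT-based isolation instead):

```sql
-- READ UNCOMMITTED maximizes throughput by not taking shared locks;
-- the StatusId = 1 "open" value is an assumption.
SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;

SELECT UserId, COUNT(*) AS OpenMicropayments
FROM dbo.POSTransaction
WHERE StatusId = 1
GROUP BY UserId
HAVING COUNT(*) > 1;
```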
In the event of unexpected results, all stored procedures must provide error messages as text
messages to the calling web service.
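Returning error text to a caller is typically done with TRY...CATCH and ERROR_MESSAGE(). A minimal sketch, using usp_AssignUser as the example; the parameter list and UPDATE body are placeholders, not the actual implementation:

```sql
-- Sketch: surface the error as text to the calling web service.
-- Parameters and the UPDATE statement are assumptions.
CREATE PROCEDURE dbo.usp_AssignUser
    @TransactionId INT,
    @UserId        INT
AS
BEGIN
    BEGIN TRY
        UPDATE dbo.POSTransaction
        SET UserId = @UserId
        WHERE TransactionId = @TransactionId;
    END TRY
    BEGIN CATCH
        -- ERROR_MESSAGE() exposes the failure as text; THROW re-raises
        -- it so the web service receives the message.
        DECLARE @msg NVARCHAR(2048) = ERROR_MESSAGE();
        THROW 50001, @msg, 1;
    END CATCH;
END;
```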
Any modifications to stored procedures will require the minimal amount of schema changes
necessary to increase the performance.
Performance
Stress testing of the mobile application on the proposed CoffeeTransactions database
uncovered performance bottlenecks. The sys.dm_os_wait_stats Dynamic Management View
(DMV) shows high wait_time values for the WRITELOG and PAGEIOLATCH_UP wait types
when updating the MobileLocation table.
Updates to the MobileLocation table must have minimal impact on physical resources.
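High WRITELOG waits combined with the statement that location updates can tolerate some data loss usually suggest delayed durability, or a non-durable memory-optimized table. Both options are sketched below; the MobileLocation column list and bucket count are assumptions:

```sql
-- Option 1: let transactions commit without waiting for the log flush
-- (acceptable here because some location data loss is tolerated).
ALTER DATABASE CoffeeTransactions
    SET DELAYED_DURABILITY = ALLOWED;

-- Option 2: a SCHEMA_ONLY memory-optimized table eliminates log and
-- data-file I/O for MobileLocation updates entirely (data is lost on
-- restart). Columns and BUCKET_COUNT are illustrative assumptions.
CREATE TABLE dbo.MobileLocation
(
    UserId    INT NOT NULL
        PRIMARY KEY NONCLUSTERED HASH WITH (BUCKET_COUNT = 500000),
    Latitude  DECIMAL(9, 6) NOT NULL,
    Longitude DECIMAL(9, 6) NOT NULL,
    UpdatedAt DATETIME2     NOT NULL
)
WITH (MEMORY_OPTIMIZED = ON, DURABILITY = SCHEMA_ONLY);
```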
Supporting Infrastructure
The stored procedure usp_LookupConcurrentUsers has the current implementation:
The current stored procedure for persisting a user location is defined in the following code:
The current stored procedure for managing micropayments needing investigation is defined
in the following code:
The current table, before implementing any performance enhancements, is defined as follows:
###EndCaseStudy###
You need to modify the stored procedure usp_LookupConcurrentUsers.
What should you do?
Which task should you use with each maintenance step?
DRAG DROP
###BeginCaseStudy###
Case Study: 7
Fourth Coffee
(The case study text is identical to the Case Study 7 text that accompanies the previous question.)
###EndCaseStudy###
You need to optimize the index and table structures for POSTransaction.
Which task should you use with each maintenance step? To answer, drag the appropriate
tasks to the correct maintenance steps. Each task may be used once, more than once, or
not at all. You may need to drag the split bar between panes or scroll to view content.
You need to redesign the system to meet the scalability requirements of the application
DRAG DROP
###BeginCaseStudy###
Case Study: 7
Fourth Coffee
(The case study text is identical to the Case Study 7 text that accompanies the previous question.)
###EndCaseStudy###
You need to redesign the system to meet the scalability requirements of the application.
Develop the solution by selecting and arranging the required code blocks in the correct order.
You may not need all of the code blocks.
Where should you set the default member Multidimensional Expressions (MDX) expression?
###BeginCaseStudy###
Case Study: 5
Contoso, Ltd Case B
General Background
You are the business intelligence (BI) solutions architect for Contoso, Ltd, an online retailer.
You produce solutions by using SQL Server 2012 Business Intelligence edition and
Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition.
A SharePoint farm has been installed and configured for intranet access only. An
Internet-facing web server hosts the company's public e-commerce website. Anonymous
access is not configured on the Internet-facing web server.
Data Warehouse
The data warehouse is deployed on a SQL Server 2012 relational database instance. The data
warehouse is structured as shown in the following diagram.
The following Transact-SQL (T-SQL) script is used to create the FactSales and FactPopulation tables:
The FactPopulation table is loaded each year with data from a Windows Azure Marketplace
commercial dataset. The table contains a snapshot of the population values for all countries of
the world for each year. The world population for the last year loaded exceeds 6.8 billion people.
ETL Process
SQL Server Integration Services (SSIS) is used to load data into the data warehouse. All SSIS
projects are developed by using the project deployment model.
A package named StageFactSales loads data into a data warehouse staging table. The package
sources its data from numerous CSV files exported from a mainframe system. The CSV file
names begin with the letters GLSD followed by a unique numeric identifier that never
exceeds six digits. The data content of each CSV file is identically formatted.
A package named LoadFactFreightCosts sources data from a Windows Azure SQL Database
database that has data integrity problems. The package may retrieve duplicate rows from the database.
The package variables of all packages have the RaiseChangedEvent property set to true. A
package-level event handler for the OnVariableValueChanged event consists of an Execute
SQL task that logs the System::VariableName and System::VariableValue variables.
Data Models
SQL Server Analysis Services (SSAS) is used to host the Corporate BI multidimensional
database. The Corporate BI database contains a single data source view named Data
Warehouse. The Data Warehouse data source view consists of all data warehouse tables. All
data source view tables have been converted to named queries.
The Corporate BI database contains a single cube named Sales Analysis and three database
dimensions: Date, Customer and Product. The dimension usage for the Sales Analysis cube is
as shown in the following image.
The Customer dimension contains a single multi-level hierarchy named Geography. The
structure of the Geography hierarchy is shown in the following image.
The Sales Analysis cube’s calculation script defines one calculated measure named Sales Per Capita.
The calculated measure expression divides the Revenue measure by the Population
measure and multiplies the result by 1,000. This calculation represents revenue per 1,000
people.
The Sales Analysis cube produces correct Sales Per Capita results for each country of the
world; however, the Grand Total for all countries is incorrect, as shown in the following
image (rows 2-239 have been hidden).
A role named Analysts grants Read permission for the Sales Analysis cube to all sales and
marketing analysts in the company.
SQL Server Reporting Services (SSRS) is configured in SharePoint integrated mode. All
reports are based on shared data sources.
Corporate logo images used in reports were originally configured as data-bound images
sourced from a SQL Server relational database table. The image data has been exported to
JPG files. The image files are hosted on the Internet-facing web server. All reports have been
modified to reference the corporate logo images by using the fully qualified URLs of the
image files. A red X currently appears in place of the corporate logo in reports.
Users configure data alerts on certain reports. Users can view a report named Sales
Profitability on demand; however, notification email messages are no longer being sent when
Sales Profitability report data satisfies alert definition rules. The alert schedule settings for the
Sales Profitability report are configured as shown in the following image.
Business Requirements
Data Models
Users must be able to:
• Provide context to measures and filter measures by using all related data warehouse
dimensions.
• Analyze measures by order date or ship date.
Additionally, users must be able to add a measure named Sales to the report canvas by
clicking only once in the Power View field list. The Sales measure must allow users to
analyze the sum of the values in the Revenue column of the FactSales data warehouse table.
Users must be able to change the aggregation function of the Sales measure.
Analysis and Reporting
A sales manager has requested the following query results from the Sales Analysis cube for
the 2012 fiscal year:
• Australian postal codes and sales in descending order of sales.
• Australian states and the ratio of sales achieved by the 10 highest customer sales made
for each city in that state.
Technical Requirements
ETL Processes
If an SSIS package variable value changes, the package must log the variable name and the
new variable value to a custom log table.
The StageFactSales package must load the contents of all files that match the file name
pattern. The source file name must also be stored in a column of the data warehouse staging
table.
In the design of the LoadFactSales package, if a lookup of the dimension surrogate key value
for the product code fails, the row details must be emailed to the data steward and written as
an error message to the SSIS catalog log by using the public API.
You must configure the LoadFactFreightCosts package to remove duplicate rows, by using
the least development effort.
Data Models
Users of the Sales Analysis cube frequently filter on the current month’s data. You must
ensure that queries to the Sales Analysis cube default to the current month in the Order Date
dimension for all users.
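A default that applies to all users (rather than to a single role) is usually set with a DefaultMember assignment in the cube's MDX calculation script. A hedged sketch; the [Order Date].[Calendar] hierarchy and the [Measures].[Revenue] reference are assumptions, since the actual dimension structure is shown only in images not reproduced here:

```mdx
// Sketch only: point the (assumed) Order Date calendar hierarchy's
// default member at the latest month that has data.
ALTER CUBE CURRENTCUBE
UPDATE DIMENSION [Order Date].[Calendar],
DEFAULT_MEMBER = Tail(
    NonEmpty(
        [Order Date].[Calendar].[Month].Members,
        [Measures].[Revenue]
    )
).Item(0).Item(0);
```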
You must develop and deploy a tabular project for the exclusive use as a Power View
reporting data source. The model must be based on the data warehouse. Model table names
must exclude the Dim or Fact prefixes. All measures in the model must format values to
display zero decimal places.
Analysis and Reporting
Reports must be developed that combine the SSIS catalog log messages with the package
variable value changes.
###EndCaseStudy###
You need to ensure that queries to the Sales Analysis cube default to the correct time period.
Where should you set the default member Multidimensional Expressions (MDX) expression?
Which data flow component should you use?
DRAG DROP
###BeginCaseStudy###
Case Study: 5
Contoso, Ltd Case B
General Background
You are the business intelligence (BI) solutions architect for Contoso, Ltd, an online retailer.
You produce solutions by using SQL Server 2012 Business Intelligence edition and
Microsoft SharePoint Server 2010 Service Pack 1 (SP1) Enterprise edition.
A SharePoint farm has been installed and configured for intranet access only. An Internetfacing web server hosts the company’s public e-commerce website. Anonymous access is not
configured on the Internet-facing web server.
Data Warehouse
The data warehouse is deployed on a 5QL Server 2012 relational database instance. The data
warehouse is structured as shown in the following diagram.
The following Transact-SQL (T-SQL) script is used to create the FactSales and FactPopulation tables:
The FactPopulation table is loaded each year with data from a Windows Azure Marketplace
commercial dataset. The table contains a snapshot of the population values for all countries of
the world for each year. The world population for the last year loaded exceeds 6.8 billion people.
ETL Process
SQL Server Integration Services (SSIS) is used to load data into the data warehouse. All SSIS
projects are developed by using the project deployment model.
A package named StageFactSales loads data into a data warehouse staging table. The package
sources its data from numerous CSV files exported from a mainframe system. The CSV file
names begin with the letters GLSD followed by a unique numeric identifier that never
exceeds six digits. The data content of each CSV file is identically formatted.
A package named LoadFactFreightCosts sources data from a Windows Azure SQL Database
database that has data integrity problems. The package may retrieve duplicate rows from the database.
The package variables of all packages have the RaiseChangedEvent property set to true. A
package-level event handler for the OnVariableValueChanged event consists of an Execute
SQL task that logs the System::VariableName and System::VariableValue variables.
Data Models
SQL Server Analysis Services (SSAS) is used to host the Corporate BI multidimensional
database. The Corporate BI database contains a single data source view named Data
Warehouse. The Data Warehouse data source view consists of all data warehouse tables. All
data source view tables have been converted to named queries.
The Corporate BI database contains a single cube named Sales Analysis and three database
dimensions: Date, Customer and Product. The dimension usage for the Sales Analysis cube is
as shown in the following image.
The Customer dimension contains a single multi-level hierarchy named Geography. The
structure of the Geography hierarchy is shown in the following image.
The Sales Analysis cube’s calculation script defines one calculated measure named Sales Per Capita
The calculated measure expression divides the Revenue measure by the Population
measure and multiplies the result by 1,000. This calculation represents revenue per 1,000
people.
The Sales Analysis cube produces correct Sales Per Capita results for each country of the
world; however, the Grand Total for all countries is incorrect, as shown in the following
image (rows 2-239 have been hidden).
A role named Analysts grants Read permission for the Sales Analysis cube to all sales and
marketing analysts in the company.
SQL Server Reporting Services (SSRS) is configured in SharePoint integrated mode. All
reports are based on shared data sources.
Corporate logo images used in reports were originally configured as data-bound images
sourced from a SQL Server relational database table. The image data has been exported to
JPG files. The image files are hosted on the Internet-facing web server. All reports have been
modified to reference the corporate logo images by using the fully qualified URLs of the
image files. A red X currently appears in place of the corporate logo in reports.
Users configure data alerts on certain reports. Users can view a report named Sales
Profitability on demand; however, notification email messages are no longer being sent when
Sales Profitability report data satisfies alert definition rules. The alert schedule settings for the
Sales Profitability report are configured as shown in the following image.
Business Requirements
Data Models
Users must be able to:
• Provide context to measures and filter measures by using all related data warehouse
dimensions.
• Analyze measures by order date or ship date.
Additionally, users must be able to add a measure named Sales to the report canvas by
clicking only once in the Power View field list. The Sales measure must allow users to
analyze the sum of the values in the Revenue column of the FactSales data warehouse table.
Users must be able to change the aggregation function of the Sales measure.
Analysis and Reporting
A sales manager has requested the following query results from the Sales Analysis cube for
the 2012 fiscal year:
• Australian postal codes and sales in descending order of sales.
• Australian states and the ratio of sales achieved by the 10 highest customer sales made
for each city in that state.
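As a sketch, the first request could be expressed in MDX along the following lines. All object names here ([Sales Analysis], [Customer].[Geography], [Measures].[Sales], and the fiscal-year member key) are assumptions, since the case study does not list the cube's dimension and measure names.

```mdx
-- Sketch only: cube, hierarchy, and member names are assumed.
SELECT
  [Measures].[Sales] ON COLUMNS,
  ORDER(
    DESCENDANTS(
      [Customer].[Geography].[Country].&[Australia],
      [Customer].[Geography].[Postal Code]),
    [Measures].[Sales],
    BDESC) ON ROWS
FROM [Sales Analysis]
WHERE ([Date].[Fiscal Year].&[2012]);
```

The second request would typically combine TOPCOUNT (to isolate the 10 highest customer sales per city) with SUM and a ratio against the state-level total.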
Technical Requirements
ETL Processes
If an SSIS package variable value changes, the package must log the variable name and the
new variable value to a custom log table.
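A typical pattern for this requirement is to set RaiseChangedEvent = True on the variable and handle the package-level OnVariableValueChanged event with an Execute SQL Task. A minimal sketch of the custom log table and the task's parameterized statement follows; the table and column names are illustrative assumptions, not part of the case study.

```sql
-- Illustrative custom log table (names are assumptions).
CREATE TABLE dbo.PackageVariableLog (
    LogID        int IDENTITY(1,1) PRIMARY KEY,
    PackageName  nvarchar(260)  NOT NULL,
    VariableName nvarchar(128)  NOT NULL,
    NewValue     nvarchar(4000) NULL,
    LoggedAt     datetime2(0)   NOT NULL DEFAULT SYSUTCDATETIME()
);

-- SQLStatement for the Execute SQL Task in the OnVariableValueChanged
-- event handler; map System::PackageName, System::VariableName, and the
-- changed variable's value to the ? placeholders.
INSERT INTO dbo.PackageVariableLog (PackageName, VariableName, NewValue)
VALUES (?, ?, ?);
```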
The StageFactSales package must load the contents of all files that match the file name
pattern. The source file name must also be stored in a column of the data warehouse staging
table.
In the design of the LoadFactSales package, if a lookup of the dimension surrogate key value
for the product code fails, the row details must be emailed to the data steward and written as
an error message to the SSIS catalog log by using the public API.
You must configure the LoadFactFreightCosts package to remove duplicate rows, by using
the least development effort.
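Within the data flow, the lowest-effort option is usually the Sort transformation with its "Remove rows with duplicate sort values" option enabled. If the freight rows were instead staged to a table first, the equivalent set-based step is a simple DISTINCT; the table and column names below are illustrative assumptions.

```sql
-- Illustrative staging table and columns (not named in the case study).
SELECT DISTINCT ShipmentID, RouteCode, FreightCost
FROM dbo.StageFreightCosts;
```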
Data Models
Users of the Sales Analysis cube frequently filter on the current month’s data. You must
ensure that queries to the Sales Analysis cube default to the current month in the Order Date
dimension for all users.
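One common way to meet this for all users is a dynamic DEFAULT_MEMBER assignment in the cube's MDX calculation script (a role-based default would not apply to everyone). The sketch below assumes the Order Date dimension exposes a [Month] attribute keyed as yyyyMM; the actual attribute name and key format may differ.

```mdx
-- Assumed attribute name and key format; placed in the cube's MDX script.
ALTER CUBE CURRENTCUBE
UPDATE DIMENSION [Order Date].[Month],
DEFAULT_MEMBER = STRTOMEMBER(
    "[Order Date].[Month].&[" + FORMAT(NOW(), "yyyyMM") + "]");
```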
You must develop and deploy a tabular project for the exclusive use as a Power View
reporting data source. The model must be based on the data warehouse. Model table names
must exclude the Dim or Fact prefixes. All measures in the model must format values to
display zero decimal places.
Analysis and Reporting
Reports must be developed that combine the SSIS catalog log messages with the package
variable value changes.
###EndCaseStudy###
You need to configure the LoadFactFreightCosts package to address the data integrity issues.
Which data flow component should you use?
To answer, drag the appropriate data flow component to the answer area.
You need to configure SSRS to meet the maximum number of snapshots requirement.
###BeginCaseStudy###
Case Study: 2
Contoso, Ltd
Background
You are the business intelligence (BI) solutions architect for Contoso Ltd, a multinational
sales company with offices in London, Madrid, Paris, Brisbane, Tokyo, and New York.
Contoso sells office consumable products such as pens, printer ink, and paper.
You produce solutions by using SQL Server 2012 Business Intelligence Edition and
Microsoft SharePoint Server 2010 Enterprise Edition with SP1.
Technical Background
Contoso’s products are categorized by using up to four levels, although some products use
only two or three levels. Products are categorized as shown in the following table.
Contoso sells products through mobile sales staff, direct marketing, and its website. Sales
personnel are located in various regions around the world, and each region has a sales
manager who is paid a quarterly bonus based on the total sales in the region during the
quarter. Regions are categorized as shown in the following table.
SQL Server Analysis Services (SSAS) is used to host a multidimensional database. The
database contains a single cube named Sales and three database dimensions named Products,
Regions, and Date. A single measure named Sales Total has been defined in the cube. The
data source for the database is a SQL Server data warehouse.
The Products dimension contains a single user-defined hierarchy named Products. To prevent
the display of empty members when users browse the Products dimension, the Extract,
Transform, and Load (ETL) process populates all missing values as shown in the following diagram.
The structure of the Products hierarchy is shown in the following diagram.
The Regions dimension contains a single user-defined hierarchy named Sales Regions. The
dimension is based on a single dimension table in the data warehouse and the attribute
relationships have not been modified since the dimension was created by using the
Dimension wizard. The structure of the Sales Regions hierarchy is shown in the following diagram.
The Date dimension contains a single user-defined hierarchy named Calendar. The structure
of the Calendar hierarchy is shown in the following diagram.
A role named UserRegions has been created in the SSAS database that will be used to filter
members in the Regions dimension based on the authenticated user.
Administrative staff from around the world will produce sales reports with Microsoft Excel
2010 based on the Sales cube.
Developers will produce reports with SQL Server Reporting Services (SSRS) based on the
Sales cube and the reports will be delivered to users through report subscriptions and a web
browser. All users log on to an Active Directory Domain Services (AD DS) domain named
contoso.com.
All client computers and servers are joined to the contoso.com domain.
Business Requirements
The BI system must meet the following reporting requirements:
• Display all sales figures in euro currency, regardless of the client’s reporting location
• Include a new measure named AD Sales that calculates average daily sales for a
selected month
• Support near real-time reporting while maintaining good performance for
multidimensional queries
• Support reports that show currency exchange rates
• Deliver executive reports that are parameterized and rendered from report snapshots
In addition, cube objects must use terms familiar to users from around the world. For
example, in the Sales Regions hierarchy, users from Great Britain must see the State level
presented as County when browsing the Sales cube.
The Sales cube must support a new measure group named Sales Planning. The measure group
must consist of a single measure named Sales Plan that enables the management team to use
Excel 2010 to enter sales plans for future monitoring.
The BI system must meet the following technical requirements:
Architecture requirements
o The system must use separate servers for each of the following components:
SQL Server Database Engine
SQL Server Integration Services
SQL Server Analysis Services in multidimensional mode
SharePoint Server with the Reporting Services Add-in
o All servers must be installed using U.S. regional settings.
o The system must source currency exchange rate data from a database hosted in
Microsoft Azure SQL Database.
Security requirements
o When possible, the system must use Windows authentication for all database
connections.
o The system must prevent users from querying data from outside of their region.
o The system must allow certain users to query data from multiple regions.
Development requirements
o When browsing the Products hierarchy, repeating values for different levels of a given
drill-path must be avoided. For example, Papers -> Copy Paper -> Copy Paper ->
Copy Paper should appear simply as Papers -> Copy Paper.
o The system must support report snapshots. The default maximum number of retained
snapshots must not exceed five.
###EndCaseStudy###
You need to configure SSRS to meet the maximum number of snapshots requirement.
What should you do? (Each answer presents a complete solution. Choose all that apply.)
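For reference, in native mode the retained-snapshot ceiling is the SystemSnapshotLimit system property, which can be set with an rs.exe script such as the sketch below. In SharePoint integrated mode, as in this case study, the equivalent limit is configured through the Reporting Services site settings in SharePoint rather than through rs.exe.

```vb
' rs.exe script sketch (native mode only); SystemSnapshotLimit is a
' documented system property of the ReportingService2010 endpoint.
Public Sub Main()
    Dim limit As New [Property]()
    limit.Name = "SystemSnapshotLimit"
    limit.Value = "5"
    rs.SetSystemProperties(New [Property]() {limit})
End Sub
```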