Reporting – Liberty Grove Software

Ready for Greater Data Insights? 6 Reasons to Implement Power BI
https://libertygrovetech.com/6-reasons-to-implement-power-bi/

Have you ever wished you had a crystal ball to help you gain critical data insights for your business? That crystal ball is here – in the form of Power BI. And, unlike most things that sound too good to be true, this powerful tool is the real deal.


But here’s the best part: your team doesn’t need to be full of technical geniuses to master it!

Power BI is a business analytics solution that, in the words of Microsoft, “lets you visualize your data, share insights across your organization, or embed them in your app or website.”

It’s a powerful combination of business intelligence (BI), reporting, and data visualization tools and services that can increase the effectiveness of people and teams.

In addition to connecting with other Microsoft products and services, Power BI sets itself apart with streamlined publication and distribution tools. But is it the best choice for your business?

How does Power BI help companies with data insights?

Every industry’s market (and each organization within it) is distinct, with unique issues and challenges. As a result, companies use a variety of selling and marketing strategies to offer a range of services and goods.

Systems in every industry produce a range of data in sizes ranging from terabytes to exabytes. Business intelligence (BI) is how organizations evaluate this data and deliver information that informs decisions. These business intelligence-based decisions are essential to the company’s performance.

BI is a rapidly expanding technology that now permeates every industry worldwide. BI services have proven highly successful at giving clients a personalized experience, and businesses are tapping that potential through a wide variety of BI services. Power BI is one such tool, and it facilitates making complex business decisions.

Why use Power BI?

Microsoft’s cloud-based business intelligence service, Power BI, is a product of the company’s extensive experience with relational databases, including Access, SQL Server, and others. An effective business intelligence platform enables organizations to thoroughly clean data before turning it into useful information. Power BI examines data in depth and delivers insights you can act on.

The Gartner Magic Quadrant for Analytics and Business Intelligence Platforms places Microsoft in the Leaders quadrant, making it the dominant vendor in the market. Gartner considers Power BI a go-to platform based on its completeness of vision and ability to execute.

Here are the top 6 reasons why Power BI is best-in-class:

1. Usability

Power BI’s interface is straightforward and user-friendly. Utilizing Power BI doesn’t require any programming knowledge. It features built-in intelligence that guides your choice of attributes for your reports by recommending the most appropriate reporting component.

Example: When you choose sales and category after selecting the appropriate data source, Power BI will automatically pick the proper column chart for you. Similarly, if you choose sales and location, it will automatically pick a map chart.

It connects to the data source via a simple user interface (UI). When choosing a data source, you can select attributes for your reports by dragging and dropping them.

Q&A is another notable feature: you type a question, and Power BI responds with a value or graph, depending on the inquiry. For instance, asking, “What was this year’s revenue by month?” will produce a chart showing your revenue broken down by month.

2. Simple to Learn

Power BI is built on the foundation of Excel and creates reports in a similar way. This makes it easy to learn, because Microsoft Excel is commonly used and widely accepted software.

Data modeling is built on the familiar principles of Microsoft SQL Server and the Microsoft Access database. As a result, Power BI’s data modeling is straightforward for both business users and programmers to embrace.

The Power BI website provides many learning resources to help you understand its impact and flexibility.

3. Easy to Work With

Power BI offers tools that make collaboration simple. Users can work with colleagues in “app” workspaces to produce interactive reports and dashboards, and they can bundle those dashboards and reports into apps that can be shared with a broader audience. For a small audience, Power BI makes sharing dashboards or reports a breeze using the mobile app.

Reports can be exported as PowerPoint presentations and printed. Users can even post reports and dashboards on public websites for anyone worldwide to view and interact with.

4. Value for Money

Users can create simple and complex reports and dashboards with the free Power BI Desktop. Power BI Pro licenses are reasonably priced (USD 10 per user per month), and if you have a larger audience, the Premium tier lets you tailor costs to your audience’s usage.

5. Broad Data Source Coverage

Power BI has connectors for Microsoft Excel, SQL Server, MySQL, Oracle, IBM DB2, IBM Netezza, IBM Informix, PostgreSQL, Sybase, SAP HANA, Amazon Redshift, Azure SQL Database, Azure SQL Data Warehouse, Azure Analysis Services, Azure Blob Storage, MailChimp, Facebook, GitHub, Salesforce, and many other data sources.

6. Effective Visualization 

Microsoft provides an SDK (software development kit) for Power BI visuals, along with a sizable library of custom visualizations. This allows users to tailor the UI to suit their needs.

For data shaping, Power BI provides Query Editor, an incredibly powerful and flexible tool with a wealth of functionality. Its most significant feature is that it is self-documenting: every transformation is recorded as a step you can review and replay. Power BI also gives you a path into learning the DAX language.

Data Modeling

Robust data modeling is the backbone of any well-developed BI solution. Building on SQL database and cube technology, Power BI offers a variety of highly effective data modeling capabilities.

Several rivals provide comparable functionality, including Zoho Reports, IBM Watson Analytics, Sisense, and Google Analytics, to mention a few. However, Power BI’s ease of use and the quick response time of Microsoft support make it stand out in the industry.

Conclusion

Power BI and other business intelligence tools give you what you need to consolidate your data streams, improve accessibility, and gain more intelligent insights. They also enable you to analyze your data strategically and keep your business operating efficiently.

It’s simple to understand why Power BI is popular among businesses seeking better insights, dynamic dashboards, and robust reporting. 

Time to consider Power BI?

Liberty Grove Software offers the tools and services you need to achieve a successful Power BI implementation that will enhance data insights throughout your organization.

Contact Liberty Grove Software for a complimentary consultation on how Power BI can give you the data insights you need to improve decision-making and compete at a higher level.

Related reading:

Everything You Need to Know about the Gifts of Microsoft Power BI – Liberty Grove Software

Liberty Grove Software is an established Microsoft Partner that focuses on providing customers with sales, service, and support for Microsoft Dynamics 365 Business Central/NAV solutions and training and upgrades.

Over more than 25 years, Liberty Grove has assisted hundreds of customers, ranging from small and mid-sized businesses to Microsoft Partners, with implementing, training on, customizing, and upgrading Microsoft Dynamics ERP solutions.

The organization is one of only a few companies worldwide that Microsoft recognizes as qualified to provide Business Central/NAV Upgrade Service Centers.

Checklist: Streamline Your Food Business and Boost Confidence with an ERP Solution for Recall and Traceability Compliance
https://libertygrovetech.com/food-recall-and-traceability-compliance/

The CDC estimates that roughly 1 in 6 Americans (48 million people) get sick, 128,000 are hospitalized, and 3,000 die from foodborne diseases each year. (Burden of Foodborne Illness: Findings | Estimates of Foodborne Illness | CDC)

That’s a food manufacturer’s nightmare.


Recalls occur. Often, the problem is beyond your control. In a recall emergency, it can be challenging to know what to do and in what order to expedite the recall process with as little friction as possible.

If you’re using an ERP solution, you’re already ahead of the situation. Many tools you’ll need for recalling goods are already available within your ERP.

This checklist has two sections:

Section 1: What to consider before a recall occurs

  • Planning for the worst-case scenario ensures preparedness for what you hope will never happen. That’s where your ERP comes in – it’s your recall assistant.

Section 2: Step-by-step procedures to work through the recall process

  • When the unthinkable happens, being prepared enables swift team action and helps maintain customer confidence – and your brand reputation.

Part I: Getting Ready 

  • Establish a Documented Recall Team

Identify those responsible for decision-making, quality assurance/technical advice, media communication, complaint investigation, contacting accounts, government contact, and legal counsel.

  • Complaint Form

While you may not need this until a recall occurs, preparing it in advance is a good idea. The complaint file has three sections:

• Documentation of the initial complaint information 

• Investigation of the complaint and documentation of the findings

• Acting on the findings of the investigation

  • Quick Information Access

An ERP solution enables quick access to critical information about products, manufacturing processes, the location of items in the supply chain, and contact information for those involved in a product recall.

  • Capability to Find Any Item in the Supply Chain

Throughout the supply chain, an ERP solution (along with barcodes and RFID) enables traceability down to specific ingredients. Connect production and quality management testing to determine which other products may pose a risk due to cross-contamination or faulty processes.

  • Actionable Communication Transparency and Openness

Open communication with suppliers, customers, and government agencies helps mitigate the impact of a food product recall. ERP users can quickly email documents, alerts, instructions (such as how/where to dispose of products, etc.), and updates to contacts in bulk or individually from the software’s interface. When a faster recall is required, manufacturers can temporarily open access to product location information by authorizing external users, such as suppliers or customers, to access and pull data from their systems.

Contact your Food Inspection Agency immediately if you suspect you sold an unsafe or illegal food product to another manufacturer, distributor, or retailer.

Part II: Plan of Action

  • Engage the Recall Management Team
  • Inform the Food Inspection Agency
  • Identify all products in the recall
  • Detain and segregate all recalled products under your company’s control
  • Create a Press Release (if required)
  • Make a Distribution List
  • Prepare and distribute the Recall Notice
  • Check the recall’s effectiveness
  • Control the recalled item(s)
  • Decide what to do with the recalled product(s)
  • If the recall occurred at your facility, resolve the issue

Quickly tracking and removing defective products, combined with clear communications, can significantly mitigate a product recall’s financial and brand reputation consequences. A company with the tools and procedures for a swift, well-managed recall shows its customers and supply chain partners that it values those relationships.

Customers judge a company on how well it prepared for a recall and how it responded when things went wrong.

Be prepared.


Manufacturers Integrate Data from Multiple Systems to Increase Profits
https://libertygrovetech.com/manufacturers-integrate-data-to-increase-their-bottom-line/

Typically, when seeking to improve the bottom line, manufacturers consider increasing income/profits and decreasing costs. They may even think about increasing efficiency.

However, in this day and age, manufacturers can also look at the role of data in every aspect of their operations.

In this article, we discuss how manufacturers integrate data from multiple systems to help increase their bottom line.


Data, Data Everywhere, and Not a Drop to Exploit!

To understand how data can increase the bottom line, we can begin by asking a lot of questions:

  • Where does it come from?
  • Where is it applied?
  • How do we use it?
  • How and where do we store it?
  • How do we analyze it?
  • Who analyzes it?
  • Do we have enough data?
  • Or do we have too much?


Where Is Your Data Coming From?

Manufacturing data can emanate from numerous sources, both internal and external.

Internal Data Sources

Every report generated in every department – finance, operations, sales, marketing, HR … – contains current data potentially useful as a window into a manufacturer’s ongoing business. Historical data reveals positive and negative trends that can guide present and future business decisions.

Sensors, too, have become ubiquitous in many manufacturing environments. Sensor technology plays an essential role in machine automation: sensors provide information about products during the manufacturing process, deliver updates about the condition of the equipment to guide maintenance and prevent downtime, and, in other scenarios, provide feedback on the motion of a motor to ensure accurate positioning.

If BI (Business Intelligence), AI (Artificial Intelligence), and ML (Machine Learning) technologies are in use, they will provide a treasure trove of additional data.


External Data Sources

The IoT (Internet of Things) and IIoT (Industrial Internet of Things) are significant external data sources. IIoT devices acquire and analyze data from connected assets, locations, and people. They provide insights that can be used to speed the manufacturing process, predict failures, reduce downtime, and increase automation.

The first step is to get hold of the data. The second step is to use it wisely. This can be accomplished by finding relationships in the data to determine cause and effect and improve efficiency and predictability. The ultimate goal is to bring computing to these data streams to shorten decision times, lower expenses, and, wherever possible, eliminate risk. 

Integrate Data from Multiple Systems, Increase the Bottom Line

Think about all the data sources discussed above; they are just the tip of the iceberg. Individual, standalone data points are of minimal use. When integrated from multiple systems and sources, these data points can yield actionable results.

Think, for example, about triangulation. If you were to try and determine where a particular cell phone call had been made, you could only do so, with any hope of success, via triangulation. Using cell tower triangulation (3 towers), it is possible to determine a phone location within an area of less than a square mile.

Multiple data points, at least 3, are required. A single data point may be an aberration. Two data points provide only a linear conclusion. The third data point provides greater dimensions.

Let’s talk about how manufacturers integrate data…

… and how to use that data to your advantage, integrating it from multiple systems in ways that can increase your bottom line.

Contact Liberty Grove Software by calling 630-858-7388 or emailing nav@libertygrovetech.com.

Related Post

Learn About Why a Practical Shop Floor Data Collection Solution is Needed

Liberty Grove Will Be Presenting at Community Summit 2022: Read All About It!
https://libertygrovetech.com/liberty-grove-will-be-presenting-at-community-summit-2022-read-all-about-it/

If you are a veteran Microsoft Dynamics user, you probably know about Community Summit. For those of you who are new to Microsoft Dynamics, suffice it to say that the annual Summit conference is the largest independent gathering of Dynamics users in the world!


This year, Summit will be held at the Gaylord Palms Resort & Convention Center in Orlando, Florida, from October 10 to 13. The content will focus on people considering Dynamics, new to Dynamics, migrating to Dynamics, or optimizing their current Dynamics software system.

Liberty Grove Will Be at Community Summit. Will You Join Us?


In fact, we will be presenting three sessions with topics ranging from managing supply chain disruptions to onboarding employees.

Andrew Good, Liberty Grove CEO, will host all three sessions.

Here are the details:

Session 1: Managing Supply Chain in Disruptive Times with Dynamics 365 Business Central
Date: Tuesday, October 11
Time: 11:00 am to 12:00 pm
Room: Destin 2 – Convention Center, Level 2
Synopsis: Leverage the tools at your disposal in Business Central to streamline operations and to identify and manage problems. Learn about several tools that you can use to identify supply chain problems early and keep them under control.

Session 2: How to Onboard New Employees and Ensure Accuracy in NAV/BC
Date: Tuesday, October 11
Time: 2:30 to 3:00 pm
Room: Gainesville 1 – Convention Center, Level 2
Synopsis: Let’s talk about what has worked for getting new people on board and effective with BC and NAV. This session will have an operations focus.

Session 3: Manufacturing – Managing Costing with Business Central
Date: Thursday, October 13
Time: 11:15 am to 12:00 pm
Room: Naples 2 – Convention Center, Level 2
Synopsis: Costing is key during the best of times. Let’s look at strategies to evaluate the impact of cost changes on your business. We know costs are changing, from materials and utilities to labor. Learn how to create a test company to work in and the elements of a good cost testing plan.

About Andrew Good


Andrew Good, our fearless leader, is Liberty Grove’s CEO.

He is a Software Engineer, project manager, analyst, manufacturing expert, and Lead NAV Microsoft Certified Trainer. Andrew works on implementations, project management, business process design, application analysis, end-user training, and implementation project management training. Andrew is part of the inaugural group of Dynamics Credentialed Professionals for Core Application Setup in Microsoft Dynamics NAV.

Let’s Say You Are Interested in the Topics We’re Presenting, but You Cannot Attend

We hope you will join us, but if you’re interested in one or more of our presentations but cannot attend, we would be happy to put you in touch with Andrew so you can chat. Just contact Liberty Grove Software. Call 630-858-7388 or email us at nav@libertygrovetech.com.

Recommended Blog to read:

The Top Key Benefits to Migrating Your ERP to the Cloud

Learn About Why a Practical Shop Floor Data Collection Solution is Needed
https://libertygrovetech.com/learn-about-why-a-practical-shop-floor-data-collection-solution-is-needed/

You can’t escape it. An unprecedented amount of data bombards you day and night, day in and day out. It’s a never-ending stream. Before establishing a practical shop floor data collection system, you must determine what data is essential for your business and what data is unimportant, merely a distraction.

Whether it’s data generated internally, data manually entered into your system, data from sensors, or data from the IIoT (Industrial Internet of Things), take a breath and decide what to keep and what to discard. Only then can you set up a shop floor data collection system that makes sense. Only then will your data collection system become practical.

The Benefits of a Practical Shop Floor Data Collection System: Your Data Tells a Story!

A practical shop floor data collection program will digitally transform your factory, allowing you to make data-driven decisions that will improve productivity based on real-time information, machine monitoring, and benchmark analysis.

You will gain real-time visibility with apps that collect data from the people, machines, and sensors throughout your operations. You will be able to optimize production by identifying and solving bottlenecks in real-time.

You will be able to calculate cycle time and first-pass yield data for a complete picture of your operation’s productivity. You will be able to minimize downtime by properly balancing work across stations.

You will uncover opportunities to improve operations along your production line. You will be able to build triggers and time-saving workflows by incorporating events and measurements recorded by devices and sensors. You will have the traceability you need for accountability.

You will be able to resolve issues and be proactive based on data, not guesswork.

Say Good-Bye to Manual Tracking

At Liberty Grove Software, the solution we recommend and offer our manufacturing clients is Shop Floor Insight, a Manufacturing Execution System (MES) add-on for Microsoft Dynamics 365 Business Central that significantly reduces the labor costs and errors associated with manual data entry. Shop Floor Insight allows employees to use barcode scanners or touch screens to speed up shop data input for time, material, quality, etc.

The numerous benefits of a shop floor data collection system include, but are by no means limited to, the following twelve:

  • Eliminate manual time entry with barcode scanning for Production Orders, Jobs, Service Orders, and Fixed Assets
  • Capture operational data on the shop floor, including consumption, output, scrap, and quality
  • Record and report on non-productive and rework time for better insight into lost productivity
  • Record time and attendance based on employee shifts with exception reporting
  • Quickly approve time with exception-based supervisor approvals
  • Automatically calculate overtime and shift differentials for integration with payroll systems
  • Streamline shop floor data capture
  • Enhance data capture and management
  • Significantly reduce rework
  • Analyze job progression and costing
  • Maintain a detailed production history
  • Immediate notification to supervisors/chain of command

Shop Floor Insight ROI (Return on Investment)

Shop Floor Insight pays for itself in as little as three months. By automating your shop floor data collection, you will increase data accuracy, reduce unnecessary overhead costs, and improve the efficiency of your staff by enabling them to focus on value-added tasks to drive business performance.

The Shop Floor Insight ROI calculator will help you estimate the key financial benefits of the solution:

  • How much overhead time can be saved by automating your shop floor data collection
  • Savings gained from increased quality
  • The increase in sales driven by improved efficiency and quality
  • The payback period for Shop Floor Insight

Take the next steps…

Contact Liberty Grove Software by calling 630-858-7388 or emailing nav@libertygrovetech.com.

Related Post

The Top Key Benefits to Migrating Your ERP to the Cloud

Everything You Need to Know about the Gifts of Microsoft Power BI
https://libertygrovetech.com/everything-you-need-to-know-about-the-gifts-of-microsoft-power-bi/


When you integrate Power BI into your Microsoft Dynamics ERP system, it’s like receiving a series of gifts from Microsoft. And those gifts are all geared toward making your ERP more powerful, more intelligent, more proactive, and more insightful. That’s why it’s called Power BI. It adds power and Business Intelligence to your solution.

Power BI is part of the Microsoft Power Platform. It is a business analytics service that provides interactive visualizations and business intelligence capabilities with an interface simple enough for end users to create their own reports and dashboards.

Here are brief descriptions of 8 gifts that are showered upon you when you implement Power BI:

Power BI Desktop – The Windows-desktop-based application for PCs and desktops, primarily for designing and publishing reports to the Service.

Power BI Service – The SaaS-based (Software as a Service) online service. This was formerly known as Power BI for Office 365 and is now referred to as PowerBI.com or Power BI.

Power BI Mobile Apps – The Power BI Mobile apps are for Android and iOS devices and Windows phones and tablets.

Power BI Gateway – Gateways are used to sync external data in and out of Power BI and are required for automated refreshes. In Enterprise mode, they can also be used by Flows and PowerApps in Office 365.

Power BI Embedded – The Power BI REST API can be used to build dashboards and reports into the custom applications that serve Power BI users, as well as non-Power BI users.

Power BI Report Server – An On-Premises Power BI Reporting solution for companies that will not or cannot store data in the cloud-based Power BI Service.

Power BI Premium – Capacity-based offering that includes flexibility to publish reports broadly across an enterprise, without requiring recipients to be licensed individually. This provides greater scale and performance than shared capacity in the Power BI service.

Power BI Visuals Marketplace – A marketplace of custom visuals and R-powered visuals.

Although the gifts listed above are only briefly described, it is still a lot to take in, let alone understand. There’s a seasoned team of Microsoft professionals at Liberty Grove Software who are at your service. We can answer all of your questions and take you on a deep dive into Power BI, the Microsoft Power Platform, and the Microsoft Dynamics 365 Power extensions, including Power BI, Power Apps, and Power Automate.

Take the next step…

Contact Liberty Grove Software by calling 630-858-7388 or emailing nav@libertygrovetech.com.

Related Post

ERP in the Cloud: A Bright Spot in the Pandemic

NAVUG Ontario (Toronto) First Meeting
https://libertygrovetech.com/navug-ontario-toronto-first-meeting/

Join Liberty Grove Software at the NAVUG Ontario (Toronto) Inaugural User Group Meeting at the Microsoft office, 1950 Meadowvale Blvd, in Mississauga, where a wealth of practical ideas, collaborative learning, in-depth knowledge, and unsurpassed networking will be offered. NAVUG events are designed exclusively for Dynamics NAV users. Attendees will benefit from:

  • Interactive Discussion sessions — Arrive with questions and leave with answers and solutions!
  • Educational Sessions—ranging from Finance and Reporting to Manufacturing and Supply Chain Management—so you can focus on what’s most important to you. Details will be provided on the registration page.
  • Plentiful opportunities to network, interact with and learn from others who use Dynamics NAV every day.

Why attend local NAVUG Events? We recognize that NAVUG events are one of several Dynamics NAV events you may be considering attending. So why choose NAVUG?

  • NAVUG meetings are designed and led by users, for users… Every attendee is focused on Dynamics NAV; learn how they’ve overcome the challenges you face every day.
  • Interactive learning at its best… Engage in lively group discussions and give-and-take—and return home with practical ideas you can implement right away.
  • We focus on the technology you’re using right now to help you get maximum value from your current solution.

You’ll benefit from plenty of unique perspectives and knowledge from Dynamics NAV users like you.

Dynamics NAV RDLC Reports: Avoiding The Upgrade Conversion Crunch
https://libertygrovetech.com/dynamics-nav-rdlc-reports-avoiding-the-upgrade-conversion-crunch/

Introduction

As a Microsoft-certified Dynamics NAV Upgrade Center, we see a lot of upgrades at Liberty Grove Software. In recent years, those upgrades have been to NAV 2009. Increasingly, with the end of NAV classic reporting now in sight, many of our upgrades have involved the conversion of classic reports to their RDLC equivalents.

For those readers not involved in the technical side of Dynamics NAV, RDLC stands for “Report Definition Language, Client-side”. RDLC is part of Microsoft’s SQL Server Reporting Services toolset, and it is the new standard technology for Dynamics NAV reporting. RDLC and classic reports are supported in tandem in NAV 2009, but in NAV 2013 only RDLC reports will be available. So if you’ve already made the move to NAV 2009, read on to understand why you’re in a great position to do your Classic-to-RDLC migration now. If you’re planning a NAV upgrade to 2009 or 2013, it is important to understand the report conversion process in order to invest your upgrade dollars wisely.

Understanding Your Report Upgrade Circumstances

If your company uses only Dynamics NAV standard reports, or standard reports with only minor modifications, you don’t need to read the rest of this article. Microsoft provides RDLC versions of all its standard reports out of the box.

The only caveat here is that, if you’ve made minor customizations to standard reports, do not attempt to retain these customizations by carrying your versions of the reports forward in the upgrade process and expecting Microsoft’s Create Layout Suggestion tool to make everything right. If you do this, you’ll overwrite all the manual work Microsoft did to fix up the RDLC versions of their standard reports, which will cause you a lot of grief. You’re much better off to manually reapply your minor customizations to the standard Microsoft RDLC reports instead.

You also don’t need to devote special attention to the report portion of your upgrade if you have only simple custom reports (i.e. lists). In this case, the Create Layout Suggestion tool will help, not hinder, and the total time required for such conversions will be minimal.

If, however, you have complex custom reports, especially if you have a lot of them, you should seriously consider making their conversion a separate project. As with any upgrade, the fewer changes you try to pack into one project (or one phase of a project), the better your chances of completing it on time, on budget, and without any significant problems.

Because they are generally independent units of code, reports are ideal candidates for this divide-and-conquer strategy. But there’s a much bigger reason to take this approach, and it has to do with quality of review.

Put Your Money on the Turtle, Not the Hare

When faced with a large body of work, most of us like to tackle it as quickly and intensely as we can, mainly to get ourselves over the hump. But when tasked with converting a large number of complex reports, it’s actually much better to take a slower approach, and the reason has to do with the completeness of each report upgrade and the quality of review.

If you ask your staff to review a large number of complex, completely rewritten reports in a short period of time, they’ll likely take a rubber stamp approach to the review process. That’s human nature.

Yet, if you’ve gone to the trouble of creating complex reports in the first place, not to mention paying to have them rewritten, they’re probably important to you. They probably play a key role in decision-making across your company.

Besides, if you’re on NAV 2009, you can still use the classic version of your reports while you create your RDLC equivalents. So go slow, convert and review one report at a time, and give your staff the time they need to get it right. This is especially true if your reports have many options and filters, which require a lot more time to test properly. The alternative is to rush things and risk running your company on potentially erroneous information.

The Problem With Microsoft’s RDLC Conversion Tool

So, why do you have to worry about rewriting all your complex custom reports in the first place, instead of using Microsoft’s Create Layout Suggestion tool to do it for you? The simple answer is, for reports with any degree of complexity, the tool doesn’t work. It’s not really Microsoft’s fault, either. The fact is, classic NAV reports and RDLC reports use completely different technologies, and are built on highly divergent paradigms. Automatically converting complex reports from one to the other is a tall order. So you’re going to need a programmer, which brings us to…

Resourcing Your Project

If, as you contemplate how you’re going to resource your report upgrade project, you find yourself thinking, “These are just reports. How hard can it be?”, and a picture pops into your head of that student programmer you just hired for the summer, think again. Rewriting a report created in one technology into the equivalent report in a completely different technology, especially if the report is not properly documented, is not a task for the inexperienced. It’s the subtle errors that will get you (such as a report that functions correctly in all circumstances except one), and inexperienced programmers are sometimes vulnerable to subtleties.

Assigning the conversion project to someone who has a lot of other tasks on his or her plate isn’t a good idea, either. Rewriting complex custom reports is a detailed, methodical, often monotonous task – not the type of work you can do with a lot of interruptions.

For these reasons, you should also consider using an official NAV upgrade center to do the work. There’s nothing like repetition to hone one’s skills at specific tasks, especially if the upgrade center has programmers who specialize in RDLC upgrades.

For more information on the technical aspects of converting classic reports to their RDLC counterparts, see the series of posts that starts here: Converting Classic Reports to RDLC – Part 1.

Dynamics NAV 2009 RDLC Datasets
https://libertygrovetech.com/dynamics-nav-2009-rdlc-datasets/

About This Post

This is a beginner-level post on RDLC dataset structures. Its objective is to clarify the structural variances in datasets passed from NAV classic reports to their RDLC counterparts, and what has to be done on the RDLC side to properly handle these variances.

It also addresses one important exception, which is the need, for performance reasons, to generate only one copy of each picture data element per report (the reason for this will become clear shortly).

RDLC Datasets 101

The first thing to understand about RDLC datasets is that their field definitions currently depend on the placement of fields in the classic section designer. Therefore, if you create a report using one table that contains five fields, and you place only three of these fields in a section in the classic report, your RDLC dataset will only consist of the three fields you placed. This fact often leads programmers to place hidden fields in their classic report sections just so they can have access to those fields on the RDLC side.

The second thing to remember about RDLC datasets is that they are built from classic data items using the traditional nested-loop approach to data processing, and are handed off as flattened datasets to the RDLC report (see the diagram in our earlier post on converting classic reports to RDLC).

This means that if you’re going to filter your report data, you should attempt to do it on the classic side of the equation wherever possible. Otherwise, you may be bringing data over to your client that isn’t needed.

There’s also a third general caveat about these datasets. You can currently affect the datasets by placing C/AL code in the triggers for classic sections (i.e. in addition to any code you place in the data item triggers). Avoid using section code at all costs, as the classic sections no longer exist as of NAV 2013, so any functionality you build into them will eventually be lost.

So, keep in mind that

  • RDLC datasets are subsets of the classic datasets, restricted to the set of fields that are placed on the classic sections.
  • For performance reasons, you’re better off filtering data on the classic side of the equation wherever possible.
  • You’re advised to avoid placing any data-related code in the classic section triggers.

You are now officially a graduate of RDLC Datasets 101.

The Header-Detail Model

The simplest data model is of course the one-table model, but it needs no explanation, so let’s start with a typical header-detail model:

You might use a structure like this when building a report showing customer data for each sales rep, say, sales by customer per sales rep. In this data item model, the Customer table is indented to the Salesperson/Purchaser table, which means the set of fields from the Customer table will be appended to the set of fields from the Salesperson/Purchaser table in the flattened RDLC dataset.

Using a simple model of Code and Name from the Salesperson/Purchaser table, and No. and Name from the Customer table, here is the dataset that you’ll get on the RDLC side:
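Here is a representative slice (the sales rep and customer values are illustrative, except for Bart Duncan, whom you’ll meet below; exact column names depend on your field placement):

Salesperson_Purchaser_Code   Salesperson_Purchaser_Name   Customer_No   Customer_Name
JR                           J. Rep                       10000         Customer One
JR                           J. Rep                       20000         Customer Two
BD                           Bart Duncan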

Because the dataset has been flattened, the Salesperson_Purchaser_Code and Salesperson_Purchaser_Name repeat for every customer record for that sales rep.

Note, also, that where no customers exist for a sales rep (as in the case of lonely Bart Duncan), the customer fields are blank.

Wondering where I’m getting this fabulous preview of the data in my RDLC dataset that lets me see exactly what’s going on? Just click the About This Report option in the upper right hand corner of the RDLC report viewer. You’ll have to do this twice (meaning you’ll have to run the report twice), first to activate this feature, second to use it, but it’s well worth this minor inconvenience.

So, how do you handle the above data structure on the RDLC side, especially if you’d like the sales rep data to be printed only once per rep?

You group the data like this (i.e. in this extremely simple report layout):

The data in this case is inside a table object consisting of two rows.

The first row is a group header row. You use this type of row on the sales rep fields so you can group the multiple instances of each sales rep into one instance, as per this group expression:
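With the dataset shown above, the group expression is simply the sales rep code field (adjust the name to match your own dataset):

=Fields!Salesperson_Purchaser_Code.Value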

The second table row is simply a details row, which contains the true details of your report – the customer data. And that’s it. By using a group header table row (if you’re using a table object), and grouping on the sales rep code, you’re essentially de-flattening the flattened dataset you were handed.

The only caveat with this technique is to make sure that, in addition to indenting detail tables in classic report designer, you also restrict them via the DataItemLink property to their parent table (in this case, ensuring that the only customers presented for a sales rep are the ones that share that sales rep’s code). Otherwise, you’ll end up with what’s known in the database world as the Cartesian product (a combination of each parent record with all possible detail records), which, if the report you’re running is big enough, could result in your untimely death at the hands of either the database administrator or the person in charge of buying printer paper.

The Header-Multiple Detail Model

So, what if you’re feeling really ambitious and want to produce a report that shows sales data for each rep not only by customer, but by item, too, giving you a classic data item structure like this (in this case, the sales data in the Sales Header/Line tables is grouped and totaled on item):

Notice how the Customer and Sales Header tables are on the same indentation level?

This will produce the following data structure:

If you look closely, you will see that, although we’re still receiving one flattened dataset, it’s actually two distinct data sets lumped together, one with sales rep data combined with customer data; the other with sales rep data combined with item sales data.

Because we have two distinct data sets, we need two different “structures” on the RDLC side to hold them. In this case, I’ll use a List object that contains 1) text boxes for the sales rep data, 2) a table for the customer-related data, 3) a second table for the item sales data.

Because I’m using tables for each subset of data, I can then set table filters, as shown here to filter the item sales data:
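For the item sales table, the filter looks something like this (the exact dataset field name for the Sales Line No. field will vary with your report):

Expression: =Fields!Sales_Line_No_.Value
Operator: !=
Value: =""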

This simple filter says, please only give me records that have something other than a blank value in the Sales Line No. field – in other words, the subset of item sales data.

Similarly, for the customer table, our table filter will filter out all the records other than the ones that have customer data.

Filter, Group, Filter, Group…

I could make this post nauseatingly long by showing you lots of other variations in RDLC datasets, but the principles for managing them are always the same: use filters and/or groups in combination with the appropriate RDLC reporting objects.

There is one situation worthy of special mention, however…

Beware the RDLC Blob

According to an old saying, a picture is worth a thousand words.

Unfortunately, when you’re dealing with flattened datasets, it can also be worth a giant performance headache. Imagine, for example, a report where you want to present information on the company’s sales reps with lots of bio information about the sales rep in the report header, including each rep’s photograph, followed by sales data in excruciating detail in the body of the report (as in lots of detail records). If each photograph is 1 megabyte, and there are 100 detail records for each sales rep (all joined to your sales rep data in some manner), you just bought yourself 99 megabytes of unwanted photograph data per sales rep.

Over on Waldo’s blog, which you will find is always worth a visit, he wrote about this a few weeks ago. To summarize his initial solution, he created an Integer table with one record at the same level as his top-level item, thereby creating two distinct subsets of data within his flattened dataset. In the section designer, he then moved the picture field into the design section for the Integer table, meaning the photo was produced only once for that report (see Waldo’s full blog on this here NAV 2009 RDLC Reporting: Working With Multiple Datasets).

This becomes trickier, however, when you have multiple records with pictures in the header portion of your dataset.

One of Waldo’s readers wrote in with a different solution, suggesting to eliminate duplicate photos by clearing the picture fields, i.e. CLEAR(NameOfPictureField), in the OnPreDataItem trigger of the data item immediately following the one containing the photo. This solution is described in this post: NAV 2009 RDLC Reporting: OutOfMemoryException When Printing An RDLC Report – Solution.

I was not successful in applying this exact solution in my data structure, but that could be entirely my failing. However, I was successful in applying the principle of this solution by using a little more explicit coding in the detail table’s OnAfterGetRecord trigger, where I cleared the variable holding the picture only after the first detail record had been successfully processed.
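In C/AL terms, the idea looks roughly like this (variable names are illustrative, and the counter would be reset in the header data item’s OnAfterGetRecord trigger):

// Detail data item - OnAfterGetRecord() trigger
DetailCount := DetailCount + 1;   // count the detail records processed for this header
IF DetailCount > 1 THEN
  CLEAR(RepPicture);              // pass the picture through only with the first detail record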

Hope this helps. If you have any questions, feel free to post them.

Converting Dynamics NAV Classic Reports To RDLC – Part 10
https://libertygrovetech.com/converting-dynamics-nav-classic-reports-to-rdlc-part-10/

Preceding Posts In This Series

In Part 1 of this series of posts, we examined the underlying technologies behind RDLC reporting. In Part 2, we looked at the basic process of converting Dynamics NAV classic reports to an RDLC layout.

Since then, we’ve been working through the specific issues you will encounter during the conversion process.

Since part 5, we’ve been walking through the conversion of Report 5703 – Transfer Order as an example of the issues you’ll face in converting a document report – the type of report likely to pose your biggest challenge in transitioning to RDLC layouts. This is the last post you will need to follow in order to complete the conversion of this specific report.

Aligning Your Classic Dataset and Your RDLC Table Structure

In Converting Dynamics NAV Classic Reports To RDLC – Part 9 of this series, we examined both the classic dataset and RDLC table structure that resulted from the Create Layout Suggestion process for Report 5703 – Transfer Order, and we found that the two were not in sync.

Indeed, it might be argued that synchronizing your classic dataset and your RDLC table structure is the most important task in the conversion process.

As with any conversion effort, it is important to understand both the source and target of the conversion. For this report, when the Show Internal Information checkbox is selected, and the number of dimension rows associated with each order header record is 2, and the number of dimension rows associated with each order detail record is 3, the theoretical flattened source data structure is:

Header 1 Fields + Header 1 Dim. Row 1 Fields + Blank Detail Fields + Blank Detail Dim. Fields
Header 1 Fields + Header 1 Dim. Row 2 Fields + Blank Detail Fields + Blank Detail Dim. Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 1 Fields + Detail 1 Dim. Row 1 Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 1 Fields + Detail 1 Dim. Row 2 Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 1 Fields + Detail 1 Dim. Row 3 Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 2 Fields + Detail 2 Dim. Row 1 Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 2 Fields + Detail 2 Dim. Row 2 Fields
Header 1 Fields + Blank Header Dim. Fields + Detail 2 Fields + Detail 2 Dim. Row 3 Fields

This leaves us with two potential subsets of records within our flattened dataset:

  1. Order header fields plus populated header dimension fields plus blank detail and detail dimension fields
  2. Order header fields plus blank header dimension fields plus populated detail and detail dimension fields

Depending on the report options and the underlying data, this report might encounter many other variations, as well, including the complete absence of dimension rows at either the order header or order detail level.

So what kind of RDLC table structure do we need to accommodate this somewhat complex situation?

First, with the potential for a one-to-many subset of header dimension records on the same level as the one-to-many subset of order detail records (i.e. both linked directly to order header records and thus at the same level in the data hierarchy), your gut instinct should tell you that you need two tables, not one.

Listen to your gut and add a new table. Position this new table above your existing table, so it is the first reporting object in the report’s body section.

This new table will contain only a detail row (no table or group header or footer rows) with fields for the order header dimension records, as can be seen here in Microsoft’s RDLC layout for Report 5703 – Transfer Order:

There is no magic to this new table. It simply prints the Header_DimensionsCaption and DimText fields that originate from the classic version of the report. However, you will need to perform a couple of actions to make it work properly in your own report (i.e. in addition to adding the new table and detail row, adding two Textboxes to the detail row, and connecting them to the Header_DimensionsCaption and DimText fields).

First, you need to return to the classic version of your report and add two new hidden TextBoxes:

  1. The first hidden TextBox references the DimensionLoop1.Number data field (i.e. the Number field from the DimensionLoop1 instance of the Integer table).
  2. The second hidden TextBox references the DimensionLoop2.Number data field (i.e. the Number field from the DimensionLoop2 instance of the Integer table).

As you should know by now, the reason we’re placing these hidden Textboxes in sections of our classic report is to ensure that the data fields they reference make it into our RDLC dataset.

What will we do with these data fields on the RDLC side? We’ll use them as follows:

  • On the Visibility tab of your new table, make visibility conditional on this expression:

    =IIf(Fields!DimensionLoop1_Number.Value > 0,False,True)

    This ensures our new table will only show if there is header dimension data.

    • On the table properties’ Filters tab, set the following filter condition:
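      Given the dataset fields we added above, the filter condition is along these lines (the operator and value are entered in the designer’s filter grid):

      Expression: =Fields!DimensionLoop1_Number.Value
      Operator: >
      Value: =0

      (Depending on how the designer types the field, you may need =CInt(0) for the value.)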
      This ensures that only valid header dimension data is processed by the table (i.e. it filters out all records in the dataset that weren’t generated from DimensionLoop1).

      Within the table, you also need to apply a visibility expression to the TextBox in the table’s first column, i.e. the one that displays the Header_DimensionsCaption field. The expression is:

      =IIF(Fields!DimensionLoop1_Number.Value = 1,False,True)

      This ensures that the header dimension caption displays only once, for the first header dimension row.

      The main table in the Microsoft RDLC version of Report 5703 – Transfer Order appears as follows:

      The rows in this table consist of:

      1. A table header row containing the column headings
      2. A group header row containing the transfer order detail fields
      3. A detail row containing the order detail dimension records
      4. A table footer row containing the Shipment Method caption and description

      There is very little RDLC magic to this table, either, as all the data rows/fields already flow through to the RDLC dataset from the classic report. The steps required to recreate this table structure in your own report are:

      1. In the table properties, navigate to the General tab and set the values as shown:

      2. Add a table header row.

      3. Move the column headings from your existing group header row into the new table header row.

      4. Move the data fields from your existing details row into your existing group header row.

      5. Edit the General tab of the existing group header row to group on Transfer Header No., OutputNo, and Transfer Line No., as shown below:
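      Assuming dataset field names generated in the usual pattern (yours may differ), the three group expressions look something like this:

      =Fields!Transfer_Header_No_.Value
      =Fields!OutputNo.Value
      =Fields!Transfer_Line_No_.Value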

      By grouping on these three fields, we eliminate the duplicates caused by using a flattened dataset.

      6. Make sure your group is sorted on the same three fields, in the same order, as you’re using for the filter.

      7. Add the following visibility expression to your existing detail row:

      =IIF(Fields!DimensionLoop2_Number.Value > 0,False,True)

      8. Add TextBoxes to the detail row for the transfer line dimension caption and dimension value fields. Note, you will have to merge cells and adjust widths, and of course connect the TextBoxes to their source fields (which may have different names in your report than in ours).

      9. Add the following visibility expression to the Textbox that contains the transfer line dimension caption (whatever its actual name):

      =IIf(Fields!DimensionLoop2_Number.Value = 1,False,True)

       …which ensures the caption will print only once.

      10. Add a table footer row to your main table and move the shipment code fields (caption and value) previously in your group footer row into this new row. Again, you will have to merge cells and make some layout adjustments to get this right.

      11. Delete your old group footer row (which should now be empty).

      You will undoubtedly still have some layout tweaking to do, but if you have implemented all this correctly, your RDLC version of Report 5703 – Transfer Order should now be fully functional. Its output should look like this:

      Next Post

      We have a few more report types we’d like to cover in this series on converting classic reports to RDLC layouts, but we’ve come to realize that by lumping them all together, we’re making it difficult for people to find information on how to convert specific types of reports. So, going forward, we’re going to address each type of report conversion in its own post or series of posts.
