Personal Insights in Tech

Category: Technical

This category encompasses posts that go into a moderate level of technical detail to discuss or solve a given problem.

Utilizing Power BI REST API for Government Cloud (GCC)

Problem & Purpose

Recently a project came up at work that required a way to scrape metadata on Reports and other items my team has stored in our Workspaces in the Power BI cloud service.


Note before continuing: I work primarily in an Office 365 government cloud (GCC) environment, so the documentation below is tailored for other GCC customers. In my research over several days to get this working, I found most searchable samples and examples were written for the public (commercial) Power BI service, with minimal to no documentation for GCC customers.


Image 1. Power BI REST API.

This project prompted my first real look into the Power BI REST API. The Power BI REST API provides service endpoints for embedding, administration, governance and user resources, letting you manage Power BI content, perform admin operations, and embed Power BI content elsewhere on the web.

To make API calls, the recommended approach is to set up authorization through a method called “Service Principal”. The basic idea is to create an authorized, generic identity that has the correct permissions to perform the calls and operations we need without being tied to the personal account of a team member. This setup can be done via PowerShell or the Microsoft Azure Portal; for this entry we will focus on the few simple steps in the Azure portal.

Setting up Service Principal

1. Create an Azure AD app and new Client Secret.

Though Microsoft documentation tends to switch the terminology, this new “App” will, in our situation, serve as the “Service Principal” entity for authorization. Make note of your {Application (client) ID}, {Directory (tenant) ID}, and {Client Secret – Value} as you follow the linked documentation steps.

Image 2. After setting up new Service Principal / App Registration in Azure Portal.

Note: Remember to copy the {Client Secret – Value} when you create it; the full value is not accessible once you navigate away from the screen, and you would have to create a new secret to get another full value.

Image 3. After setting up new Client Secret.

2. Create an Azure AD security group

Image 4. After setting up new Security Group to house the Service Principal.

Add the newly created App Registration as a Member of the new Security Group:

Image 5. After adding Service Principal to the new security group.

3. Edit Power BI Tenant Settings for API authorization

For the next step, you need the Fabric Administrator role assigned to your user and activated.

Image 6. Confirming the personal user making Power BI tenant changes has necessary role to enable API calls.

Follow Enable the Power BI service admin settings to enable Non-Admin and/or Admin API calls for your Service Principal inside the new Security Group. These settings are in the Admin Portal of the Power BI service:

Image 7. Accessing the Admin Portal in Power BI Service.

4. Add the Security Group as Admin to chosen Workspace(s)

Add the service principal (via the new Security Group) to your chosen Workspace(s) as an Admin member.


O365 GCC – Power BI REST API call configurations

Now we should have everything set up that we need to utilize a Power BI Service Principal and make (Non-Admin/Admin) Power BI REST API calls with it. For testing and executing these API calls, we will need the three important values noted earlier:

  • {Application (client) ID} – ex. 43b6…
  • {Directory (tenant) ID} – ex. a8ca…
  • {Client Secret – Value} – ex. Opd*******

For this exercise, I will verify using the Postman API platform directly.


1. Our first step is to create a [Post] call to our tenant utilizing the Service Principal to get an authorized “Bearer” token. We set up a [Post] call for OAuth 2.0 authorization to our Power BI tenant:
https://login.microsoftonline.com/{Tenant ID}/oauth2/v2.0/token/

Image 8. Drafting the POST call to generate authorized Bearer token for Service Principal.

2. Next, we form the Body of the [Post] call with a few Key/Value pairs:

Image 9. Adding key-value pairs to the {body} of POST API call.

3. Third, if we have entered all the [Post] body key-value pairs correctly, we should get a “200 OK” server response and a returned “Bearer” access token value starting with “ey…”. This authorizes us as the Service Principal to make API [Get] calls. Be sure to copy the entire “access_token” value, from the beginning “ey” to the end of the string.

More on managing access tokens for API requests can be found here.

Image 10. Confirming successful POST API call to generate Bearer token.
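For anyone who prefers to script this step instead of using Postman, the same token request can be made from PowerShell. Below is only a minimal sketch of the approach described above: the placeholder values are the ones noted during app registration, and the GCC scope URL is my assumption for the government cloud resource, so confirm it against your own working call before relying on it.

# Sketch: request a client-credentials token for the Service Principal (placeholder values).
$tenantId     = "{Directory (tenant) ID}"
$clientId     = "{Application (client) ID}"
$clientSecret = "{Client Secret - Value}"

$tokenBody = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://analysis.usgovcloudapi.net/powerbi/api/.default"   # assumed GCC resource scope
}

$tokenResponse = Invoke-RestMethod -Method Post `
    -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
    -Body $tokenBody -ContentType "application/x-www-form-urlencoded"

# The "Bearer" access token (starts with "ey...") used for the API calls that follow.
$accessToken = $tokenResponse.access_token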

4. Now that our Service Principal has authenticated with a valid and active token (for now), we can attempt many of the possible Power BI REST API calls for data. We will start with a non-Admin API call. Open a new tab in Postman and configure a [Get] API call with the following example URL: https://api.powerbigov.us/v1.0/myorg/groups

Click on the [Authorization] section for this API call, select “Type: Bearer Token”, and paste the full value into the “Token” section.

Image 11. Drafting a sample GET API call utilizing new Bearer token.
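The equivalent call from PowerShell simply places the token in an Authorization header. A minimal sketch, assuming $accessToken was captured from the token request above:

# Sketch: call the non-Admin Groups endpoint as the Service Principal.
$headers = @{ Authorization = "Bearer $accessToken" }
$groups  = Invoke-RestMethod -Method Get -Uri "https://api.powerbigov.us/v1.0/myorg/groups" -Headers $headers

# List the Workspaces/Groups the Service Principal can see.
$groups.value | Select-Object id, name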

This will return all Workspaces/Groups – Microsoft documentation uses the terms interchangeably – that the Service Principal has been given access to via the Security Group. In my example below, this currently comprises only a Development and a Test workspace.

Image 12. Viewing results of successful GET API call.

Note that the returned “Bearer” token is active by default for up to one hour after login or renewal calls, though depending on your security needs this can be reduced to 5 minutes or extended up to 24 hours. If the token has expired when you try to make an API call, you will get a guided response:

Image 13. Showing the guided message if/when token expires and triggering call.
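The token response itself also includes an “expires_in” value (in seconds), so a script can refresh the token proactively rather than waiting for the expired-token message. A small sketch of that idea, reusing the variables from the earlier token request:

# Sketch: track when the current token expires and request a fresh one shortly beforehand.
$tokenExpiresAt = (Get-Date).AddSeconds([int]$tokenResponse.expires_in)

if ((Get-Date) -gt $tokenExpiresAt.AddMinutes(-5)) {
    # Re-run the earlier token POST to obtain a new Bearer token.
    $tokenResponse = Invoke-RestMethod -Method Post `
        -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" `
        -Body $tokenBody -ContentType "application/x-www-form-urlencoded"
    $accessToken = $tokenResponse.access_token
    $headers     = @{ Authorization = "Bearer $accessToken" }
}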

5. If we have enabled Admin read-only API calls through our Power BI tenant Admin Portal for the same Security Group holding our Service Principal, we can also utilize Admin API operations. Changing the [Get] call to a URL like the following returns all Reports the authorized Service Principal has access to: https://api.powerbigov.us/v1.0/myorg/admin/reports

Image 14. Sample of making an Admin read-only API call to the Power BI Service.
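In PowerShell form, only the URL changes; the same Bearer header carries the authorization. Another sketch, assuming the variables defined in the earlier examples:

# Sketch: Admin read-only call to list reports, using the same Bearer header as before.
$adminReports = Invoke-RestMethod -Method Get -Uri "https://api.powerbigov.us/v1.0/myorg/admin/reports" -Headers $headers
$adminReports.value | Select-Object id, name, webUrl, datasetId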

Conclusion

And that is a quick demonstration of setting up and configuring a Service Principal to utilize Power BI REST API calls for the government cloud (GCC).

In my next post I will cover how we can utilize these API calls with parameters in Power Query to pull this metadata into Power BI for ETL operations and utilization.

Using PnP.PowerShell to retrieve SharePoint Online List Data

The Problem

Working (at this time) with Power BI Report Server, my team and I occasionally encounter hurdles with getting data to automatically refresh from certain sources. In this case, we were unable to directly connect to and refresh data from our organization's SharePoint Online site list. In this post I detail how I use the PnP.PowerShell module to retrieve SharePoint Online site list data.

The Solution

After a few development trials, I crafted the following PowerShell script (sample), which uses the locally logged-in user's credentials and the PnP.PowerShell module to easily retrieve data from SharePoint Online site lists. As long as the local user has access to the site list in question, and the module is installed on the machine, developers can simply paste this script into PowerShell ISE and configure it for their own site list(s) and data column needs.

Upon execution, the script briefly pops up a dialog passing local credentials to the SharePoint site in question, then shows a progress bar as data is exported and transformed, closing when finished:

Progress bar of data export progress from SharePoint Online site list data.

This script can thus be run on demand to refresh the data as often as needed, or saved to a specified location and set up in Task Scheduler to run automatically:

Automatically run script each business day at 8:30AM.
Settings in Task Scheduler to run the specified .ps1 PowerShell Script.
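If you would rather create the scheduled task from PowerShell than click through Task Scheduler, something like the following sketch registers an equivalent weekday 8:30 AM task. The script path and task name here are placeholders rather than the actual values from my environment:

# Sketch: register the export script as a weekday 8:30 AM scheduled task (ScheduledTasks module).
$action  = New-ScheduledTaskAction -Execute "powershell.exe" `
    -Argument '-ExecutionPolicy Bypass -File "\\fileServer\Scripts\Get-SharePointOnlineListData.ps1"'
$trigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Monday,Tuesday,Wednesday,Thursday,Friday -At "8:30AM"
Register-ScheduledTask -TaskName "SharePoint Online List Export" -Action $action -Trigger $trigger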

Future Plans

My team and I plan to move this operation into a SQL Server Integration Services (SSIS) package, but at this time there are security concerns with using Visual Studio 2019 / .NET 5 (out of support), and Visual Studio 2022 has not yet added development features for SSIS – an irksome problem for me and many other developers since October 2021; more details on this here.

Conclusion

This post has covered in detail how I use the PnP.PowerShell module to retrieve SharePoint Online site list data. To see other PowerShell scripts I have developed, see my other PowerShell blog posts, or my PowerShell Operations GitHub repository. Thanks for reading!

SCRIPT:

# Script:        Get SharePoint Online List Data
# Developer:     James Scurlock

#
# Description:   Run from a developer machine in PowerShell ISE; uses local credentials to access a SharePoint Online site
#                and retrieve data from a given List for export to .csv/.xlsx. The intention is for this data to be pulled
#                regularly to refresh a Power BI Report based upon said data.
#
# Last Modified: 7/6/2022
#--------------------------------------------------
# PRE-REQUISITE(S): 
#--------------------------------------------------
#    1. Have Access to the SharePoint Site List in question
#    2. Install-Module -Name PnP.PowerShell
#--------------------------------------------------
# DEVELOPER NOTE(S):
#--------------------------------------------------
#    PnP PowerShell Documentation: https://pnp.github.io/powershell/cmdlets/Connect-PnPOnline.html#interactive-login-for-multi-factor-authentication
#--------------------------------------------------
# CONFIGURATION PARAMETER(S):
#--------------------------------------------------
$siteURL = "https://{yourDomain}.sharepoint.com/sites/{nameOfSite}"
$listName = "{nameOfList}"
$exportFileLocation = "\\fileServer\ITGroup\DBTeam\SharePointExportedData\"
$CSVPath = $exportFileLocation + "Exported Data.csv"
#$ExcelPath = $exportFileLocation + "Exported Data.xlsx"

# EDIT $fieldRenameMap in the export loop below as needed for fields to be renamed upon Export.
$propertiesToExport = @("ID","Title","field_1","field_2","field_3"); 
#--------------------------------------------------
# BEGIN SCRIPT
#--------------------------------------------------
try 
{
	# Check if Module is installed on current machine
    if (Get-Module -ListAvailable -Name PnP.PowerShell) {
        $dataForExport = @()
        cls
    
        # Connect-PnPOnline: https://pnp.github.io/powershell/cmdlets/Connect-PnPOnline.html
        # INFO - The following prompts for credentials, but Fails with either 403 Forbidden or non-matching name/password in Microsoft account system. Will not work if Multi-Factor Authentication (MFA) is enabled.
        #    Connect-PnPOnline -Url $SiteURL -Credentials (Get-Credential) # 
        # INFO - Non-Admin: The following returns "The sign-in name or password does not match one in the Microsoft account system" - assuming this also fails due to Multi-Factor Authentication.
        #        Admin: AADSTS53000: Device is not in required device state: compliant. Conditional Access policy requires a compliant device, and the device is not compliant. The user must enroll their device with an approved MDM provider like Intune.
        #    $creds = (New-Object System.Management.Automation.PSCredential "<<UserName>>",(ConvertTo-SecureString "<<Password>>" -AsPlainText -Force))
        #    Connect-PnPOnline -Url $siteURL -Credentials $creds
        # Ex. 9 - Creates a brief popup window that checks for login cookies. Can use -ForceAuthentication to reset and force a new login.
        #         NOTE: This is required for systems utilizing Multi-Factor Authentication.
        Connect-PnPOnline -Url $siteURL -UseWebLogin 

        $counter = 0
   
        #$ListItems = Get-PnPListItem -List $ListName -Fields $SelectedFields -PageSize 2000 # PageSize: The number of items to retrieve per page request
        #$ListItems = (Get-PnPListItem -List $ListName -Fields $SelectedFields).FieldValues
        $listItemIDTitleGUID = Get-PnPListItem -List $listName #  = Id, Title, GUID

        foreach ($item in $listItemIDTitleGUID)
        {
            $counter++
            Write-Progress -PercentComplete ($counter / $($listItemIDTitleGUID.Count)  * 100) -Activity "Exporting List Items..." -Status  "Exporting Item $counter of $($listItemIDTitleGUID.Count)"

            # Get the specified properties for each list item by ID using the PnP cmdlet - https://pnp.github.io/powershell/cmdlets/Get-PnPListItem.html
            $listItems = (Get-PnPListItem -List $listName -Id $item.Id -Fields $propertiesToExport).FieldValues
    
            # Build an export object for each item. Internal SharePoint columns (field_1/2/3) are renamed to their actual field designations upon Export.
            $fieldRenameMap = @{ "field_1" = "ProjectSponsor"; "field_2" = "InitiativeKey"; "field_3" = "AreaOfFocus" }
            $listItems | foreach {
                $itemForExport = New-Object PSObject
                foreach ($field in $propertiesToExport) {
                    # Use the friendly export name if one is mapped; otherwise keep the original field name.
                    if ($fieldRenameMap.ContainsKey($field)) { $exportName = $fieldRenameMap[$field] } else { $exportName = $field }
                    $itemForExport | Add-Member -MemberType NoteProperty -Name $exportName -Value $_[$field]
                }
		
                # Add the object with the above updated properties to the export Array
                $dataForExport += $itemForExport
            }
        }

        # Export the result array to csv (and optionally xlsx) files
        $dataForExport | Export-CSV $CSVPath -NoTypeInformation
        #$dataForExport | Export-Excel -Path $ExcelPath

        # Disconnect the current connection and clear token cache.
        Disconnect-PnPOnline
    } else {
        cls
        Write-Host "In order to execute this script, the Cmdlet Module {PnP.PowerShell} must be installed. Please open an Admin shell and run the following command: Install-Module -Name PnP.PowerShell"
    }
} catch {
    Write-Host $_.Exception.Message
    $Error.Clear()
}
#--------------------------------------------------
# END SCRIPT
#--------------------------------------------------

Using PowerShell to retrieve Azure DevOps Work Item Data

This PowerShell Script and any future enhancements can be found in my Public GitHub repository – Azure DevOps Work Item Data Extraction.ps1

Sample of Script to retrieve Azure DevOps Work Item data.

Working in a Microsoft-centric ecosystem, my team currently utilizes Azure DevOps. For a reporting project the other day, I had a request where, alongside creating a dashboard for data scraped from a third-party tool’s API, business users wanted to incorporate data from a particular Azure DevOps Board to review how quickly certain related Work Item types were being resolved over time.

The PowerShell script below documents how this task was achieved, outputting the desired data as a .json file for modeling in Power BI, and is relatively simple overall – in my opinion.

We utilize Work Item Query Language (WIQL) to get the fields we care about through the DevOps REST API call.

Steps:

  1. Navigate to the Project of interest in Azure DevOps. Make note of your {organization}, {project}, and {team}; these strings will need to be added to the script below as variables.
  2. In the top-right, select the User Settings icon and go to Personal access tokens. Assuming you have never set one up before, create a new token. For this example, I gave the access token a name, selected my {Organization}, specified Full access, and set it to expire after the maximum number of days I could specify (1 year). You can customize this to better fit your security and expiration preferences.

    Finding Azure DevOps Personal Access Tokens

    Creating new Personal Access Token

    Settings for Personal Access Token

    Make note of this token value, as once you leave the screen you cannot retrieve it again and must create an entirely new token.
  3. Paste the token value and other appropriate strings into variables. Paste the copied token value into the {$token} variable, and paste your noted {$organization}, {$project}, and {$team} strings into their appropriate variables. Specify where the exported data should be saved in the {$exportFileLocation} variable. The example below uses my Downloads folder.
  4. Build your data query criteria. In the {$body} variable, build out the criteria that determines which work items you want data for. In my case, I was looking for work items with the custom type of “Vulnerability” that had been completed, i.e., in a state of “Done”.

    A quick note on this:

    The WHERE clause is the important part of the query inside the {$body} variable. By REST API design, the return of this call will only contain the Work Item ID and the Work Item URL – regardless of what columns you specify in the SELECT clause.
  5. Select the field data columns to export. Customize the {$customObject} and {$exportData} variables to output the fields you wish to export for the work items matching your query. Working inside PowerShell ISE, auto-complete helped me discover a number of available fields, both default and custom-defined in my custom work item type.

And that is pretty much it.

Assuming the user whose access token you run this script with has full access to the work items you are querying, each work item matching your criteria will be looped through to retrieve its details, and the results are then exported as a .json file to your specified location.

This PowerShell Script and any future enhancements can be found in my Public GitHub repository – Azure DevOps Work Item Data Extraction.ps1

CHALLENGE

One notable challenge I ran into with this project: the v6.0 DevOps REST API documentation mentions you should be able to query multiple work items at once and retrieve the field data for each. I could not get this to work, as each call returned empty field data: “fields=;”

Working: $url = "https://dev.azure.com/$organization/$project/_apis/wit/workitems/12345?`$expand=All&api-version=5.0"
Not Working: $url = "https://dev.azure.com/$organization/$project/_apis/wit/workitems?ids=46477,43819&api-version=6.0"

I will update this in the future if I figure out how to get it working; for now, simply iterate through each work item returned by the query criteria and retrieve its field data one ID at a time.

# Script:        Get DevOps WorkItem Data
# Last Modified: 4/10/2022 3:30 PM
#
# Note: If copying from the Repo, add in your Personal-Access-Token (PAT) and specify the export file location.
#--------------------------------------------------------------------------------
# 1. Use WIQL to query for a list of Work Items matching certain conditions
#--------------------------------------------------------------------------------
$token = "xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx"

$organization = "AnalyticArchives"
$project = "ProDev"
$team = "ProDev Team"

$exportFileLocation = 'C:\Users\{YourUser}\Downloads'

$codedToken = [Convert]::ToBase64String([Text.Encoding]::ASCII.GetBytes(":$($token)"))
$AzureDevOpsAuthenicationHeader = @{Authorization = 'Basic ' + $codedToken}

$uri = "https://dev.azure.com/$organization/$project/$team/_apis/wit/wiql?api-version=6.0"

$body = @{
    #NOTE: Regardless of the columns in the SELECT clause, this call only returns each Work Item's ID and URL by design
    "query" = "Select [System.Id], 
                      [System.Title], 
                      [System.State],
                      [Custom.FirstDetected] 
                      From WorkItems 
                      Where [System.WorkItemType] = 'Vulnerability' 
                      AND [State] = 'Done'"
} | ConvertTo-Json 

$response = Invoke-RestMethod -Method Post -Uri $uri -Headers $AzureDevOpsAuthenicationHeader -Body $body -ContentType 'application/json'

#--------------------------------------------------------------------------------
# 2. Use the returned list of Work Item IDs to get field data
#--------------------------------------------------------------------------------
$exportData = @()
$tempID = 0
foreach ($item in $response.workItems) {
    $tempID = $item.id
    $url = "https://dev.azure.com/$organization/$project/_apis/wit/workitems/$($tempID)?`$expand=All&api-version=5.0"
    $response = Invoke-RestMethod -Uri $url -Headers $AzureDevOpsAuthenicationHeader -Method Get -ContentType application/json 

    foreach ($field in $response) {

        $customObject = new-object PSObject -property @{

            "WorkItemID" = $field.fields.'System.Id'
            "URL" = $field.url
            "WorkItemType" = $field.fields.'System.WorkItemType'
            "Title" = $field.fields.'System.Title'
            "IterationPath" = $field.fields.'System.IterationPath'
            "State" = $field.fields.'System.State' 
            "AssignedTo" = $field.fields.'System.AssignedTo'.displayName
            "CoAssigned" = $field.fields.'Custom.CoAssigned'.displayName
            "FirstDetected" = $field.fields.'Custom.FirstDetected'
            "DueDate" = $field.fields.'Microsoft.VSTS.Scheduling.DueDate'
            "Priority" = $field.fields.'Microsoft.VSTS.Common.Priority'
            "Effort" = $field.fields.'Microsoft.VSTS.Scheduling.Effort'
            "ChangedDate" = $field.fields.'System.ChangedDate'
            "ChangedBy" = $field.fields.'System.ChangedBy'.displayName
            "ParentWorkItem" = $field.fields.'System.Parent'
            "RelatedWorkItems" = $field.relations
        } 
    $exportData += $customObject      
    }
}
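
#--------------------------------------------------------------------------------
# 3. Preview the collected fields in the console, then export the data as .json
#--------------------------------------------------------------------------------
# NOTE: The Select-Object below only writes a preview to the console; the full
#       $exportData array is what gets converted to JSON for the export file.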

$exportData | Select-Object -Property WorkItemID,
    URL,
    WorkItemType,
    Title,
    IterationPath,
    State,
    AssignedTo,
    CoAssigned,
    FirstDetected,
    DueDate,
    Priority,
    Effort,
    ChangedDate, 
    ChangedBy,
    ParentWorkItem,
    RelatedWorkItems

$exportData | ConvertTo-Json | Out-File "$($exportFileLocation)\DEVOPS_EXPORT.json"

Data Modeling best practices in Power BI

Summary: Today I experienced a good case study proving the benefits and importance of utilizing data modeling best practices, such as a star schema, in Power BI, and how doing so aids report size, refresh time, and performance, even on a rather simplistic source query.

Below I detail the original situation and how much improvement is seen when switching to a star schema. Various segments have been redacted for privacy and security; this should not affect the reader's ability to follow along.

This write-up assumes you have some familiarity with the concept of a Star Schema in Power BI utilizing Fact and Dimension Tables, and the Power BI Desktop tool.


Situation

Utilizing newly learned techniques from various training courses and resources, my team has been striving to review our existing Power BI Reports for improvement.

One point of focus for us is a few reports that seem to have disproportionately long daily refresh times. We note these issues in a custom Admin report we designed to give us metrics on each report over time:

Admin Report, examining refresh times for various Power BI Reports.

Focusing on the report in question, we see it has averaged approximately 15-16 minutes to refresh data over the last two weeks. This seems excessive on the surface, given how simple the source query is.

Examining our “Long Refreshing Report” in particular, we see it averages 15 minutes to refresh.

Before we begin making any changes, note the current file size is 10.8 MB. We do not concern ourselves with the visual elements (3 slicers and a 2-layer matrix), as we are focused on the data model itself and data load times.

Below is the [redacted] singular source query for this report, along with the related Applied Steps in the Power BI data model, which currently take 15-16 minutes to refresh source data:

Various applied steps in our query.
Redacted overview of our existing main query.

Running the query in Oracle SQL Developer, we see there are approximately 2.3 million rows in this result set:

Row count of original query.

Updates

Following the normal procedure for changing the data model into a star schema, we modify our original source query to:

  • Remove duplicate, unnecessary columns like [code_value] on lines 2 and 17.
  • Trim the case logic on [Fund] (lines 5-8) down to just the [fund_id], which we will link to a dimension table to retrieve the appropriate values.
  • Trim the case logic on [classification_type] (lines 19-22) down to just the [classification_id], linking to another dimension table for the full text.

  • Note: In our case, it was not necessary for this data to be individual-specific and include unique personal IDs. We grouped the data up to a summary level in the source SQL directly, instead of relying on Power BI to do this for us in visuals. We also modified our WHERE clause date check from comparing against a static date (greater than or equal to a fixed value) to always returning two years’ worth of data as measured from January 1st of the current year, up to and including the current date.
Old way of date checking the data.
Updated way of checking the date for results.

All these changes together result in the main query structure below. The query is now cleaner, and the dimensional data we need is extracted out into other queries/tables; otherwise, those dimension values would have to be replicated in each and every one of our fact table rows.

Updates made to the source query simplify it and reduce the result set.

These changes take us from 2,329,236 rows in our main fact table query, down to just the 4,024 rows worth of data we need.

This updated source query becomes our Fact Table, factTable1, now with only a singular applied step to change the DateTime field to a Date:

Applied Steps needed after updating our source query.

We then create Dimension tables for the:

  • Funds (dimTable1), containing 2 columns and 2 rows.
  • Source types (dimTable2), containing 4 columns and 16 rows.
  • A custom-defined Date table (dimTable3), containing the [full_date], [year], [month_long], [month_num], and [fiscal_year] for every date we return in our main query for the fact table. These fields are pulled from a reusable and easily edited View we maintain in our database.

Shown below, we keep this dimension table filtered using List.Max() and List.Min(). In this manner, we keep our date table limited to only the range of dates we actually need for the fact data.

Advanced Editor steps showing how the updated query for Fact Table is arranged.

After closing and applying our Query changes, there are 2 things to keep in mind with this approach:

  1. First, we mark our Date dimension table explicitly as a {Date Table} on the Data Modeling tab in Power BI.

    This lets Power BI know this table is to be used in relation to all date fields.
  2. Remember to wire up all your Dimension tables to your Fact table:
New data model splitting out Fact Table and Dimension Tables.

From here, we simply update our visuals to pull values from the newly created dimension table fields.

Conclusion

With visuals now updated, we save the report and trigger a one-time refresh on the Power BI Report Server.

New refresh time after making query and data model changes is far reduced.

And there we have it!

With these minor changes we see vast improvement, including easier maintainability and a refresh time reduced from 16 minutes to 1 minute. The report now also takes up only 124 KB of space on our report server, as opposed to 10.8 MB. Thanks for reading, and I hope this helps someone else in the future like it has me.

My Tools (2021)

I wanted to take some time to go over the varied software tools I have made use of, both previously as a Software Engineer and currently in my role as a Data Analyst – several, of course, overlap!

Note that a large portion of my career has been in a Microsoft-centered IT department, aside from our Oracle back-end.

Starting more in the role of a Software Engineer, previously I used the following in my day-to-day:

Having served so long on the Development Team as a Software Engineer, I still use a lot of the same tools as the need arises. This saves time waiting for someone else to find an answer I can readily get myself, and gives me insight into a lot of our common business workflows.

Now as a Data Analyst however, I spend a lot more time using the following:

And of course there are several I have on my to-do list to investigate further in the future:
