SharePoint slow? Troubleshooting slowness issues in SharePoint Online and on-prem

As more and more organizations move to the cloud, I hear quite often, at conferences I attend and from clients I work with, that they experience SharePoint performance problems or that SharePoint is slow to respond: either the page loads too slowly, or it happens regularly during the day for no visible reason. It just happens sometimes, it is there for a day or two, then it suddenly disappears, and then it is suddenly back. And it is frustrating, because by the time you start troubleshooting, the issue may already be gone.

When you troubleshoot an on-prem environment it is sometimes quite easy. You know what the SharePoint farm looks like; it is always the same set of different server roles. We have web front-end servers where users connect and consume content, we have application servers where the important services run, and we have back-end SQL servers where all the data gets stored. If on-premises SharePoint is slow, you monitor all the server roles and the network, and then you figure out that your CPU is under heavy load, you have assigned too little memory, or the underlying storage system is simply slow, which is why page loading appears slow to the end users.

In the Office 365 cloud the situation is completely different – you do not have control over the resources; you can control only what the environment looks like logically, e.g. how many site collections you have, subsites, permissions, etc.

You don't know how many WFE servers you have in the Office 365 cloud, how many application servers are running in the background, whether the SQL database sits on a shared machine, how much RAM it has, how many other users that SQL box is hosting, or whether it is a shared or dedicated instance.

SharePoint Online is built with multitenancy in mind from scratch. There is a reference on Microsoft Docs on how to build a SharePoint farm that is hypothetically sized for 300,000 users.

This is not your SharePoint Online farm; the design is quite similar, but everything is built on a completely different scale. You don't have 10 WFE servers, you have hundreds of them, you have HA (between multiple farms), you have redundancy across geo-located data centers, and the bottom line is that it is built on a far larger scale than on-premises SharePoint farms.

Load testing of a SharePoint Online tenant is prohibited, as described in this article: https://docs.microsoft.com/en-us/office365/enterprise/capacity-planning-and-load-testing-sharepoint-online

Any results you might get from load testing are temporary and you cannot use them as a reference, because they can vary a lot. You have probably heard quite a few times that Microsoft automatically throttles load tests, meaning the results are not representative and might mislead you.

So, the approach to SharePoint Online monitoring will be completely different. We will use a set of tools (mainly the browser) to extract information from the end user's perspective, and we will try to pull some valuable data out of those numbers to lead us in the right direction and detect what could possibly be going on.

How SharePoint Online is delivered to you and your end users

I will briefly explain how SharePoint Online works in the backend network and how content gets delivered to users sitting in different geographic regions.

This picture shows how Microsoft delivers SharePoint Online to you globally through the Azure network.

This effectively means that Azure traffic between the datacenters stays on the Microsoft network and does not flow over the public internet. You first connect to an edge node, and from the edge node you enter the Azure network, where Microsoft routes your request through its ultra-fast, ultra-reliable global network. Microsoft has invested in dark fiber and has grown its long-haul WAN capacity by 700 percent over the last three years.

Fact: Microsoft owns and runs one of the largest WAN backbones in the world.

The image below illustrates how two users connect from their endpoints over an edge node into the Azure network and then to the destination inside the Azure network.

A US user sitting in San Diego will first connect to the edge node in Los Angeles and then continue through the ultra-fast Azure internal network to the destination farm sitting in the North Central US data center.

The UK user will not go through a long-haul TCP connection over the Atlantic Ocean; rather, the user will connect to the edge node in London and then, through the Microsoft Azure network, access the North Central US data center and your SharePoint Online site living there.

It is important to mention that this traffic does not flow over the public WAN but always stays within the Azure network. All the traffic between the WFE, APP and Search servers, and all of it between the SQL servers, always stays on the Microsoft private Azure network, regardless of the source and destination region.

 

Troubleshooting with Chrome DevTools

For the troubleshooting I will use Google Chrome. Most browsers today have a developer console where you can check the details that are important for troubleshooting why SharePoint is slow.

I will use our demo tenant for troubleshooting and it looks like this:

In Google Chrome the DevTools are activated with F12, as in most modern browsers.

When you press F12, go to the Network tab. While on the Network tab, press Ctrl + F5 to reload the page. The key here is to select the first .aspx page, because that is the SharePoint page that will show us the data that is important for troubleshooting. When you select the SharePoint .aspx page, scroll down a bit in the right-hand pane where the headers are and you will find the headers we want.

The header names, with short explanations (a PowerShell sketch for reading them follows the list):

  • SPIisLatency – the time in milliseconds spent on the front-end web server after the request has been received by it, but before the web application begins processing the request. This value should be around 0 or 1, or very close to those numbers. I have seen it spike to two-digit numbers, but very rarely, or for a single request, before dropping back to single digits.
  • SPRequestDuration – the time in milliseconds it took to process the request on the server. The bottom line is that this is the end-to-end processing time for the SharePoint page. Healthy pages show a couple of hundred milliseconds here, so in a range of 100-500 ms; if it is larger than that, you might experience some issues. If this number is high, the last header, X-SharePointHealthScore, will usually be high as well.
  • SPRequestGuid – this is basically the correlation ID, and we will use it for troubleshooting if we need to report slowness to Microsoft. Microsoft support will require this ID, because "SharePoint Online is slow" on its own is not something they can work with.
  • X-SharePointHealthScore – an indication of how heavily loaded the SharePoint server that served your request is. The value goes from 0 to 10, where 0 or very close to 0 is great, and higher values indicate a performance problem. If this number is constantly over 5, that is an indication that something bad is going on with the farm hosting your tenant and that you need to report it back to Microsoft.
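
If you want to grab the same headers outside the browser, here is a minimal PowerShell sketch. It assumes you have captured a valid FedAuth cookie value from an authenticated browser session and use a hypothetical site URL; cookie-based authentication details vary per tenant, so treat this as illustrative only.

# Hypothetical page URL and FedAuth cookie value (copy it from DevTools > Application > Cookies)
$url = "https://contoso.sharepoint.com/sites/demo/SitePages/Home.aspx"
$fedAuthCookie = "<paste the FedAuth cookie value here>"

$session = New-Object Microsoft.PowerShell.Commands.WebRequestSession
$session.Cookies.Add((New-Object System.Net.Cookie("FedAuth", $fedAuthCookie, "/", "contoso.sharepoint.com")))

$response = Invoke-WebRequest -Uri $url -WebSession $session -UseBasicParsing

# Print the headers discussed above (on some tenants they are simply missing)
$response.Headers.GetEnumerator() |
    Where-Object { $_.Key -match 'SPIisLatency|SPRequestDuration|SPRequestGuid|SharePointHealthScore' } |
    ForEach-Object { "{0} = {1}" -f $_.Key, $_.Value }

If the cookie is expired or missing, you will get a redirect or an access denied response instead of the page, so the DevTools approach above is usually the quickest way.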

If you don't see SPIisLatency and SPRequestDuration, tough luck – Microsoft disabled these headers just to aggravate your troubleshooting 😉 Just kidding; if you don't see these headers, we describe below how to extract the values anyway. On some SharePoint Online tenants these numbers are simply not visible, and you need to take extra steps to get them.

So sometimes, on some tenants, these headers are not available and the response headers look like this. To be honest, of the test tenants I have, some show these headers and some don't. Why this happens we don't know, and there are numerous articles on the web from people who have noticed the same thing. This will probably become consistent in the future; until then, there is a solution at the end of the article that you can check out later.

Page Diagnostics for SharePoint

Page Diagnostics for SharePoint is a Chrome extension that can be installed from the Chrome Web Store to troubleshoot your SharePoint Online pages. You can download it from here.

Page Diagnostics for SharePoint is a simple-to-use tool that can be installed in the browser and provides basic troubleshooting for your slow SharePoint sites. Hopefully Microsoft will expand the tool's use cases in the future.

 

The tool has a fairly limited use case for now, but it will help you troubleshoot:

  • It does not analyze system pages such as allitems.aspx; it runs only on site pages
  • It runs only on classic sites for now (hopefully modern will be added at some point)

The main things this Chrome extension checks are:

Please note: the Page Diagnostics Tool for SharePoint extracts the SPRequestDuration and SPIisLatency headers from the SharePoint Online page. If these two headers are empty, then unfortunately those values are not available through the standard APIs in your tenant. More on the solution at the end of the article.

SysKit Insights

With this being said, I am proud to have been part of the team that built a tool that will let you detect whether your SharePoint is slow to respond. We even managed to build an engine that extracts SPRequestDuration and SPIisLatency when those are not available, as on some SharePoint tenants or on SharePoint modern sites.

Out of the box, the solution is capable of reading the following metrics:

  • Uptime of the web page
  • X-SPHealthScore
  • SPRequestDuration
  • SPIISLatency
  • Page response time, i.e. how long the SharePoint page takes to load

The following environments are currently supported:

  • SharePoint Online modern and classic sites
  • All SharePoint on-premises environments from 2010 to 2016
  • SharePoint 2019 classic and modern sites

On top of these metrics, the tool provides a drill-down of file requests showing how long each request and file took to load. If there are huge background images, you will be the first to know.

If you need to report the correlation ID (the SPRequestGuid mentioned above) back to Microsoft for troubleshooting, the tool will extract it for you, so you can pass it to Microsoft support and make the investigation easier. To check out the tool, click here: https://www.syskit.com/products/insights/download/

What is your experience with troubleshooting SharePoint Online slowness? What tools do you use? Leave a comment below – I am quite interested in understanding these patterns in SharePoint Online.

Posted in SharePoint

How to make Firefox run faster, a.k.a. run it in a single process (Firefox is slow)

Chrome started to give me trouble on multiple websites, because in order to work I needed to do a hard cache clear on every visit to a site, so after a while I tried Firefox once again to check whether it works better than Chrome.

On the first run the speed of Firefox was more or less the same as Chrome. Then after a while I started seeing that Firefox was slow, because I have a tendency to open quite a lot of new tabs – the number sometimes varies from 30 to multiple windows with 20+ tabs :D

What I saw with this usage is that Firefox was using more than double the memory of Chrome, so if Chrome used 4 GB in total, Firefox usage would be 8 GB+. This is not acceptable to me, as my laptop has 16 GB of RAM, so it would fill up quite quickly if I open anything else as well.

After looking around, I found a few settings in Firefox's about:config that helped me; basically, they make Firefox run its tabs in a single process instead of multiple processes, which is the default. This decreased the amount of memory used from 6-8 GB to around 2-3 GB.

Try it yourself (a PowerShell sketch that writes these preferences to user.js follows the steps):

  1. In the Firefox address bar type about:config
  2. In the search box enter the string autostart
  3. You will find three settings: browser.privatebrowsing.autostart, browser.tabs.remote.autostart and browser.tabs.remote.autostart.2
  4. Set all three to false (as I understand it, autostart.2 applies to Firefox versions lower than 58, so if you are on one of those, that is the setting for you)
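If you would rather script the change, here is a minimal PowerShell sketch that appends the same preferences to user.js in every Firefox profile (user.js values override about:config at startup). It assumes the default profile location under %APPDATA% – close Firefox before running it.

# Preferences matching the about:config settings above
$prefs = @(
    'user_pref("browser.privatebrowsing.autostart", false);'
    'user_pref("browser.tabs.remote.autostart", false);'
    'user_pref("browser.tabs.remote.autostart.2", false);'
)

# Append them to user.js in each profile folder (assumes the default profile location)
$profileRoot = Join-Path $env:APPDATA "Mozilla\Firefox\Profiles"
Get-ChildItem $profileRoot -Directory | ForEach-Object {
    $userJs = Join-Path $_.FullName "user.js"
    Add-Content -Path $userJs -Value $prefs
    "Updated $userJs"
}
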
Posted in Uncategorized

SharePoint audit logs

Intro

SharePoint audit logs allow your organization to see who is accessing which files and folders, site collections, sites and subsites, document libraries, lists or list items – you name it. If you need to audit who did what on a SharePoint site, whether for compliance projects such as GDPR, for an external auditor, for tax audits, or simply because your organization's policy is to audit everything, you don't need to look anywhere else. In today's blog post I will cover what auditing in SharePoint is, where you need to enable it, and how to automate enabling it on all SharePoint site collections within your farm (that last part is basically the reason for writing this post, because I could not find how to enable it for all site collections on a SharePoint farm).

The article was written for SharePoint 2016, but I believe auditing works the same on SharePoint 2013 and SharePoint 2019 as well. If something has changed in the meantime, please let me know in the comments below.

Where to enable Audit Settings on the farm level

  1. Go to Central Administration
  2. Go to Application Management, then Manage Service Applications
  3. Select the Secure Store service application
  4. On the ribbon, click Properties
  5. In the Enable Audit section, select the audit log check box
  6. To change the number of days after which entries are purged from the audit log, specify a number of days – the default is 30

Where to configure audit settings for the site collection

You need to go to Site Settings > Site collection audit settings.

There we can configure several things.

Audit Log Trimming

Here you can enable automatic audit log trimming and configure how long the data is kept before it is trimmed. The first setting enables trimming, and the second lets you specify the number of days of audit data to retain. If you leave it empty, the farm-wide setting is used; the default, trimming at the end of the month, means the log retains data for roughly 60 days, i.e. about two months. The recommendation is not to set this per site collection but to rely on the farm-wide setting.
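
If you do want to set this per site collection from PowerShell (for example for one particularly large site collection), here is a minimal sketch using the server object model. The site URL is hypothetical, and TrimAuditLog and AuditLogTrimmingRetention are, to the best of my knowledge, the SPSite properties behind these two settings – verify them on your farm before relying on this.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$site = Get-SPSite "http://sharepoint/sites/finance"   # hypothetical site collection URL
$site.TrimAuditLog = $true                  # enable automatic audit log trimming
$site.AuditLogTrimmingRetention = 60        # keep roughly two months of audit data
$site.Dispose()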

Documents and Items and Lists, Libraries, and Sites

Now we need to set the events to audit for the various types. The following events are available:

  • Opened and downloaded documents, viewed items in lists, and viewed item properties (BTW N/A for SharePoint Online sites)
  • Edited items
  • Checked-out and checked-in items
  • Items that have been moved and copied to other locations in the site collection
  • Deleted and restored items
  • Changes to content type and column
  • Search queries
  • Changes to user accounts and permissions
  • Changed audit settings and deleted audit log events
  • Workflow events
  • Custom events

Depending on what you need to audit, you can enable just some of the settings, but usually people want to audit everything, so let's check all the boxes.

FYI, you need to be careful with the first option on huge farms. "Opened and downloaded documents, viewed items in lists, and viewed item properties" can have quite an impact on the SQL server and on the size of the table where SharePoint stores the logs.

Every SharePoint environment is different, and you need to track how your database grows over time and whether you can support this amount of logging.

PowerShell to enable all audit settings on all site collections

Manually enabling auditing on each site collection would be a PITA, because you would need to go to every site collection, open the site collection audit settings and tick all the check-boxes. That is the reason I created a PowerShell script that discovers all the site collections and enables full auditing on each of them.

FYI, I have seen farms with as many as 200k+ site collections, so imagine having to do this manually :D

This PowerShell script runs with elevated privileges and enables all the auditing settings on every SharePoint site collection.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
{
   $sites = Get-SPSite -Limit All
   foreach($site in $sites)
   {
       # Combine every audit flag so that all event types are captured
       $site.Audit.AuditFlags = [Microsoft.SharePoint.SPAuditMaskType]::CheckOut `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::CheckIn `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::View `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Delete `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Update `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::ProfileChange `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::ChildDelete `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::SchemaChange `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::SecurityChange `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Undelete `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Workflow `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Copy `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Move `
           -bor [Microsoft.SharePoint.SPAuditMaskType]::Search

       # Persist the change and release the site object
       $site.Audit.Update()
       $site.Dispose()
   }
})

Where to View and Generate SharePoint Audit Logs Reports

Once you have configured auditing, you can view the logs in Site Settings. Under Site Collection Administration you will find the Audit log reports option. This is really simple: you just select the type of audit events you want, and SharePoint generates an Excel file for you.

The file has two sheets: the first contains the count of the selected event for each document, and the second lists all the users who accessed the documents. Plain and simple.

The PITA is that you need to do this for every event type and for every site collection, so it can again be time-consuming.

PowerShell to automate viewing audit logs from all site collections

Manually generating Excel files for each site collection is a lot of manual work, so I decided to make your life easier with PowerShell.

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
[Microsoft.SharePoint.SPSecurity]::RunWithElevatedPrivileges(
{
  $sites = Get-SPSite -Limit All
  foreach($site in $sites)
  {
      # Query all audit entries for this site collection
      $auditQuery = New-Object Microsoft.SharePoint.SPAuditQuery($site)
      $auditLogs = $site.Audit.GetEntries($auditQuery)
      foreach($logEntry in $auditLogs)
      {
        # Resolve the user ID to a display name and print the entry
        $user = $site.RootWeb.SiteUsers.GetByID($logEntry.UserId).Name
        Write-Host "Document: " $logEntry.DocLocation " Event: " $logEntry.Event " User: " $user " Details: " $logEntry.EventData
      }
      $site.Dispose()
  }
})

This is an example I generated on my test farm for all the site collections:

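If you would rather end up with a file you can open in Excel instead of console output, a small variation of the same loop exports the entries to CSV. This is only a sketch: the output path is hypothetical and it needs PowerShell 3.0 or later for [PSCustomObject].

Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$outFile = "C:\Temp\AuditLog.csv"   # hypothetical output path
$entries = foreach($site in Get-SPSite -Limit All)
{
    $auditQuery = New-Object Microsoft.SharePoint.SPAuditQuery($site)
    foreach($logEntry in $site.Audit.GetEntries($auditQuery))
    {
        # One row per audit entry, including which site collection it came from
        [PSCustomObject]@{
            Site     = $site.Url
            Document = $logEntry.DocLocation
            Event    = $logEntry.Event
            User     = $site.RootWeb.SiteUsers.GetByID($logEntry.UserId).Name
            Occurred = $logEntry.Occurred
            Details  = $logEntry.EventData
        }
    }
    $site.Dispose()
}
$entries | Export-Csv -Path $outFile -NoTypeInformation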

I hope you now understand how to set up audit logs at the farm level and for each site collection. Let me know in the comments below what you think about it and how you audit your farm.

BTW, if you want a professional-looking solution where you can manage audit logs across multiple SharePoint farms and get polished, exportable reports, take a look at SPDocKit. SPDocKit is easy to install, and you can use it to offload your audit logs from the farm and keep them for longer periods. Once you offload the logs, the audit table becomes a lot smaller, and if you need to search through the audit logs or give an auditor access to data from a few months ago, SPDocKit will hold the logs practically forever.

Posted in SharePoint

Remote Desktop Services Manager 2016

Are you missing the good old Terminal Services Manager (Remote Desktop Services Manager) from Windows Server 2008 R2? For a reason unknown to me, Microsoft decided to remove this MMC snap-in, which was a quick management tool when you needed to kill a process on a specific server or check which users were currently logged on to your servers. As of today, Microsoft has not provided an official replacement for this handy tool.

That is a pity, because I have used the tool numerous times.

So I started researching whether I could get it back, and it seems that the tool from Windows 2008 R2 works on Windows 2012, 2012 R2, 2016 and Windows 10 as well! :-) All you need to do is copy the files from here: tsadmin, and follow these steps (a PowerShell sketch that automates them follows the list):

  1. The zip contains four files: tsadmin.msc, wts.dll, tsadmin.dll and tsadmin.reg, which I created for this experiment to work.
  2. Extract the files to c:\Windows\System32\
  3. Double-click tsadmin.reg to add the registry information the Terminal Services Manager needs in order to load the MMC snap-in
  4. Double-click tsadmin.msc
  5. Voila, it works!
  6. Best of all, if you add more servers under "mygroup", the next time you run it, it will just work and load the servers you added before
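
Here is the promised sketch that automates the steps above from an elevated PowerShell prompt. It assumes the downloaded archive sits in your Downloads folder as tsadmin.zip (a hypothetical path – adjust it to wherever you saved the file) and that you have PowerShell 5 or later for Expand-Archive.

# Extract the files straight into System32
Expand-Archive -Path "$env:USERPROFILE\Downloads\tsadmin.zip" -DestinationPath "C:\Windows\System32" -Force

# Import the registry information the MMC snap-in needs
reg import "C:\Windows\System32\tsadmin.reg"

# Launch the snap-in
mmc "C:\Windows\System32\tsadmin.msc"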

So what can you do with it? I have tested it and confirmed it works with:

  • Windows Server Remote Desktop Services 2012
  • Windows Server Remote Desktop Services 2012 R2
  • Windows Server Remote Desktop Services 2016
  • Windows Server Remote Desktop Services 2019, build 17623 (at the time of writing, Windows Server 2019 RTM had not been announced yet)

The functions that work are:

  • disconnecting a session
  • sending a message
  • resetting a session
  • viewing session status
  • logging off a session
  • ending a process on the Processes tab (one of the simplest and most important features of the tool)

 

Let me know in the comments whether it is working as it should for you as well.

BTW, if you need a powerful, full-blown management tool, you can take a look at SysKit Monitor.

Posted in Remote Desktop Services, Windows general

SQL Server Analysis Services Flight Recorder – how to check the change logs as a PRO

So you have been assigned the weekend DBA role and you need to take care of your company's SQL servers as well. I will focus here on checking the change logs for SQL Server Analysis Services; in some other blog post we will pay attention to the SQL Server database engine as well.

The specific problem I had: on a specific database role, the cell data read permissions were enabled. I was testing a third-party app that relied on this type of permission, so of course, if you do not assign the appropriate cell data permissions, the application does not work and returns something like: "read access to the cell is denied".

The error led me to the fact that read permissions were enabled at the cell data level and that the user was not able to read data from a particular place because the cell data was not configured correctly.

I disabled "Enable read permissions" for the cell data in the role properties for that specific role, and of course, as soon as I did that, the third-party application started working, but I knew I needed to add another property back for the application to keep working properly.

What I didn't know was what exactly was in that property before, so when I just ticked "enable read permissions" and saved, I lost whatever query was there before. Okay, so now I have read permissions enabled for all the content, but I know I need to restrict them to just specific content.

I need to know what was there before the save.

First you need to find the Analysis Services logs, which are located here: C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Log (if your installation is in a different place you need to look there; to find where the logs go, check the Properties of the SQL Analysis Server instance and go to the General tab).

Then navigate to that location and you will see files that look something like this:

The key file for us is the Flight Recorder back file; it is like a "black box" trace of what has changed on the SQL server, or in this scenario, on SQL Server Analysis Services. This is great because it traces all the recent changes on the server, and if you deleted something accidentally you will be able to see what was there before (the settings, not the actual data) and then revert the changes on the SQL server.
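
If you want to locate these trace files quickly, here is a minimal PowerShell sketch that lists them newest first. It assumes the default log folder shown above and the usual FlightRecorder*.trc naming – adjust the path and filter for your instance.

# List the flight recorder trace files in the SSAS log folder, newest first
$logDir = "C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Log"
Get-ChildItem -Path $logDir -Filter "FlightRecorder*.trc" |
    Sort-Object LastWriteTime -Descending |
    Select-Object Name, Length, LastWriteTime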

When you open the flight recorder file it will look something like this:

So inside this file is a trace of everything you have recently changed on the SQL server. This is really powerful: if you deleted something, or if you need to check the audit log, or see what happened on query begin, query end or a query subcube event, everything is in one single location.

What I knew was that I had changed the Expression for the cell data, and that I had basically lost it. This is how it looks :-)

 

What you need to focus on is the EventClass – Command Begin and Command End in my situation.

There it is: after the change, I could see the expression that was there before.

So from this screenshot we can clearly see that the change was on the role with RoleID 3, that I had ticked the Allowed checkbox (in SQL Server Management Studio it is called "enable read permissions" on the cell data page), and what the expression was before. BINGO!

This change was logged just before I deleted the whole expression, so I can easily extract the query for SQL Server Management Studio and apply it to the server again.

The SQL Server Analysis Services Flight Recorder is a great tool for checking recent changes on the SQL server in detail; if you deleted something by accident, or you don't remember an exact setting, it can help you revert the change. Play with it, I am sure you will find it useful.

Posted in SQL

Citrix XenApp Installation Guide

This is a step-by-step Citrix XenApp installation guide created to help you quickly deploy and configure XenApp solution in your environment.

For those of you who have stumbled upon XenApp for the first time, let me explain what XenApp actually does.

Citrix XenApp is application virtualization software that allows Windows applications to be accessed via individual devices from a shared server or cloud system without the necessity of installing them. With XenApp, Windows applications can be used on devices that typically could not run them (such as Mac computers, mobile devices…), but also it enables otherwise incompatible apps to run on Windows desktops.

The latest released version is Citrix XenApp 7.13, and this is the one we used for this guide.

Read more ›

Posted in Citrix

7 Things You Should Do When You Inherit An SQL Server

I’ve decided to put together this blog post as a beginner’s guide to dealing with an inherited SQL server environment.

You should read this if you're going to be put in charge of the SQL servers in your company – for example, if you've just been hired as the new DBA, or if you're a consultant, or maybe a Windows admin tasked with the job of being a DBA as well.

If you’ve inherited a mess, where do you start? What do you do when the only guy who knows the infrastructure leaves?

Read more ›

Posted in SQL

Adding GUI to the server core aka Converting a Server Core to full GUI version Windows Server 2012 R2

Lately I have had a lot of issues converting three Server Core systems to the full GUI version of Windows Server 2012 R2.

I tried the following approaches, which did not work for me and failed with this error message (The source files could not be found. Use the "Source" option to specify the location of the files that are required to restore the feature. For more information on specifying a source location):

  1. Mounted the Windows image to the drive and tried installing the feature via Dism /online /enable-feature /featurename:ServerCore-FullServer /featurename:Server-Gui-Shell /featurename:Server-Gui-Mgmt (this should download the files from Windows Update, but it would not work)
  2. Mounted the Windows image to a folder on a drive and then pulled the source from there; the process includes the following:
    – create a folder to mount the Windows image: mkdir c:\mountdir
    – determine the index number to use from the image source: Dism /get-wiminfo /wimfile:<drive>:\sources\install.wim
    – mount the WIM file using: Dism /mount-wim /WimFile:c:\sources\install.wim /Index:4 /MountDir:c:\mountdir /readonly (the index can change in your environment depending on the output of the previous command; in mine it was number 4, Datacenter)
    – start PowerShell from cmd and run: Install-WindowsFeature Server-Gui-Mgmt-Infra,Server-Gui-Shell -Restart -Source c:\mountdir\windows\winsxs

Read more ›

Posted in Uncategorized

Downloads folder slow to load/sort in Windows 10

If you are like me, you probably love downloading all sorts of stuff from the internet, and over time, when you have a lot of files in the Downloads folder, it becomes slow to load – or when you want to sort by date, for example, it takes forever or never finishes.

What happens here is that the Downloads folder is optimized as a Pictures folder, so Windows is probably trying to generate a thumbnail for every file/picture.

Here is the solution:

Read more ›

Posted in Windows general

Free Helpdesk ticketing solution for internal IT with Office 365 SSO

For communication with our clients, our company uses the Zendesk ticketing system. I must say Zendesk is a great solution, very modern and fast, and it comes with integrated chat and phone as well.

We were looking for a separate ticketing system for our internal infrastructure, because over the years we have grown a lot and we now have a few system engineers maintaining the internal IT and the test systems for our developers. Zendesk was great, but we use SSO with it, so in order to open a ticket for internal IT some people would need to log out of one Zendesk system and then log in to another, with or without SSO, which could cause a huge mess. So we decided to look elsewhere.

What I found is a great Zendesk competitor, Freshdesk, which is free for unlimited agents in the basic edition:

Freshdesk Plans

Freshdesk is a free and really easy solution that we will use for our internal IT needs from now on. Best of all, not only is Freshdesk free, but the Single Sign-On integration with Azure is free as well. Here is the tutorial on how to configure Freshdesk to work with your existing Azure subscription so users can authenticate through Azure Active Directory. Awesome stuff, great work guys!

Posted in Active Directory, Azure
