

Workflow Manager Disaster Recovery

Workflow Manager is an add-on to SharePoint 2013. In previous versions the workflow engine was part of SharePoint itself; in SharePoint 2013 there are two flavors available when developing workflows: the SharePoint 2010 workflow engine and the new Workflow Manager, which is now a separate component. Once the new Workflow Manager is installed and configured, you can create SharePoint 2013 workflows.

The Workflow Manager architecture relies on several separate databases in which the workflows are stored, so workflows no longer live in the content databases. It is therefore important that any backup and restore plan covers these databases in case a restoration is needed, and during a restore there are specific steps and scripts that must run in a certain order to bring the workflow farm back correctly.

Below I walk through the steps to restore a workflow farm. The scripts used can be found at this link: Scripts

Backup Workflow Manager

The following components and information need to be backed up and stored in order to restore Workflow Manager later.

  • Back up the following databases using your standard database backup procedures; the backup frequency should match that of the content databases (see the sketch after this list).
Database Name: Description
WFResourceManagementDB: Workflow Manager Resource Management Store
WFInstanceManagementDB: Workflow Manager Instance Management Store
SbGatewayDatabase: Service Bus Gateway Database
SBMessageContainer01 through n: Service Bus Message Container Databases
  • Document the accounts and passwords used while configuring Workflow Manager.
  • Export all the certificates generated during the initial Workflow Manager configuration along with the certificate password, and be sure to save the thumbprint of the Service Bus certificate.
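For reference, below is a minimal PowerShell sketch for backing up these databases with the SQL Server module. The SQL instance name, backup share, and exact database names are assumptions; adjust them to match your farm.

# Minimal sketch: back up the Workflow Manager / Service Bus databases.
# Assumes the SQL Server PowerShell module (SQLPS) is available and the
# database names match the defaults listed above.
Import-Module SQLPS -DisableNameChecking

$sqlInstance = "SQLSERVER01"              # assumption: your SQL Server instance
$backupPath  = "\\backupserver\WFBackup"  # assumption: your backup share
$databases   = "WFResourceManagementDB",
               "WFInstanceManagementDB",
               "SbGatewayDatabase",
               "SBMessageContainer01"     # add SBMessageContainer02..n if your farm has more

foreach ($db in $databases)
{
    Backup-SqlDatabase -ServerInstance $sqlInstance -Database $db -BackupFile (Join-Path $backupPath "$db.bak")
}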

Notes:

  • The certificates exist in the local computer certificate store, in both the Personal folder and the Trusted Root Certification Authorities folder; all of them need to be exported, especially the Service Bus certificate.
  • To identify the Service Bus certificate, run the command Get-SBFarm; the returned result contains the farm certificate thumbprint. Match this thumbprint against the local machine certificates in the Personal folder.
  • Export the certificates with their private keys.
  • Keep track of the outage date and time when it happens, as it will be needed during the restore.
  • Document all ports used during initial configuration.
  • Get the symmetric key using the command Get-SBNamespace -Name WorkflowDefaultNamespace and save it.
  • Save all scope names, especially the one used to register the Workflow Manager farm with SharePoint; to list them, run the getscopes.ps1 script. (A sketch for gathering the other items follows this list.)
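The following minimal sketch gathers this information from an existing Workflow Manager/Service Bus server. The output folder and thumbprint are placeholders, and the exact property layout of the Get-SBFarm output can vary between Service Bus versions, so review the saved output.

# Minimal sketch: capture Service Bus farm details, the symmetric key, and the certificate.
# Run in an elevated PowerShell window on an existing Workflow Manager/Service Bus server.
New-Item -ItemType Directory -Path C:\WFBackup -Force | Out-Null   # output folder is an assumption

# Farm details, including the farm certificate thumbprint
Get-SBFarm | Format-List * | Out-File C:\WFBackup\SBFarm.txt

# Namespace details, including the symmetric key mentioned in the note above
Get-SBNamespace -Name WorkflowDefaultNamespace | Format-List * | Out-File C:\WFBackup\WFNamespace.txt

# Export the Service Bus certificate with its private key
# (replace the placeholder thumbprint with the one reported by Get-SBFarm)
$thumbprint  = "<thumbprint from Get-SBFarm>"
$pfxPassword = Read-Host "PFX password" -AsSecureString
Get-ChildItem "Cert:\LocalMachine\My\$thumbprint" |
    Export-PfxCertificate -FilePath C:\WFBackup\ServiceBusCert.pfx -Password $pfxPassword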

Restore Workflow Manager

  • Prepare a new machine, joined to the same domain as the old machines, on which to install Workflow Manager.
  • Use the same accounts when the workflow manager was configured originally.
  • Make sure the App Management service application and its service instance are running, along with the Subscription Settings service.
  • Make sure the User Profile service and User Profile Synchronization service are running, and run a full synchronization.
  • Install Workflow Manager on the new machine but don’t configure it.
  • To install Workflow Manager, install the Web Platform Installer and search for “Workflow Manager 1.0 Refresh”.
  • Restore the backed-up databases.
    • Only the following databases should be restored:
    • WFResourceManagementDB
    • WFInstanceManagementDB
    • SbGatewayDatabase
    • SBMessageContainer* (all message container DBs)
      • Note: Do not restore the WFManagementDB and SbManagementDb databases as they will be recreated as part of the restore operation.
  • Restore any needed content DBs, web applications and site collections.
  • Restore the certificates exported during backup.
  • Open an elevated SharePoint PowerShell (Run as Administrator) window on the new machine.
  • Copy all the scripts and the wfm.config to a folder and navigate to that folder from the SharePoint PowerShell console.
  • Run restoreWFStep1 script (modify the parameters in the script as appropriate first).
  • Run restoreWFStep2 script (modify the parameters in the script as appropriate first).
  • For each additional container database, run the previous step again after modifying the parameters (only needed if more than one container database exists).
  • Run restoreWFStep3 (modify the parameters in the script as appropriate and put the thumbprint for the imported certificate).
  • Run restoreWFStep4 (modify the parameters in the script as appropriate first).
  • Run Set-SBNamespace -PrimarySymmetricKey keyvalue -Name WorkflowDefaultNamespace (replace keyvalue with the symmetric key saved during backup).
  • Open new SharePoint PowerShell as Administrator.
  • Navigate to the folder containing scripts from the SharePoint PowerShell console.
  • Run the following commands
    • $filename = Resolve-Path .\wfm.config
    • [System.AppDomain]::CurrentDomain.SetData("APP_CONFIG_FILE", $filename.Path)
  • Run restoreWFStep5 script (modify the parameters in the script as appropriate first).
  • Run restoreWFStep6 script (modify the parameters in the script as appropriate first).
  • Adjust DNS for workflow manager.
  • In IIS, make sure the Workflow Manager IIS site has an HTTP binding with the correct port.
  • You may need to register the workflow farm with the scope name used in the original farm:
    • Register-SPWorkflowService -SPSite "http://webapplication" -WorkflowHostUri "http://workflowsite:12291" -AllowOAuthHttp -Force -ScopeName "MyScope"
    • This command needs to run on the SharePoint farm. (This step is only needed if the SharePoint farm does not see the workflow service.)
  • Run the following timer jobs from Central Administration (or start them from PowerShell, as in the sketch after this list):
  • Workflow Auto Cleanup
  • Notification Timer Job
  • Hold Processing and Reporting
  • Bulk workflow task processing
  • Refresh Trusted Security Token Services Metadata feed [Farm job – Daily]
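The same jobs can also be started from the SharePoint Management Shell. This is a minimal sketch; the display names below are taken from the list above and may differ slightly between builds, and some of these jobs exist once per web application.

# Minimal sketch: start the timer jobs listed above from the SharePoint Management Shell.
$jobTitles = "Workflow Auto Cleanup",
             "Notification Timer Job",
             "Hold Processing and Reporting",
             "Bulk workflow task processing",
             "Refresh Trusted Security Token Services Metadata feed"

foreach ($title in $jobTitles)
{
    Get-SPTimerJob | Where-Object { $_.DisplayName -like "*$title*" } |
        ForEach-Object { Start-SPTimerJob $_ }
}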

Notes:

To add other workflow manager servers, just import the certificate and run restoreWFStep4 (after modifying the parameters) and then run restoreWFStep6 (after modifying the parameters).

There are other restoration scenarios. If only the SQL server is affected and the application servers are not, uninstall Workflow Manager from one existing application server and then follow the steps above. Once done, on the other nodes uninstall and reinstall Workflow Manager without configuring it, and just run the commands in the restoreWFStep4 and restoreWFStep6 scripts.

Another case is when the application server itself is affected. In this case, prepare a new machine and apply the steps above; then, for any additional node, install Workflow Manager but don’t configure it, and just run the restoreWFStep4 and restoreWFStep6 scripts to add those nodes.

Storage-Related Performance Issues in SharePoint

Here are five storage-related issues in SharePoint that can kill performance, with tips on how to resolve or prevent them.

Problem #1:

Unstructured data takeover. The primary document types stored in SharePoint are PDFs, Microsoft Word and PowerPoint files, and large Excel spreadsheets. These documents are usually well over a megabyte.

SharePoint saves all file contents in SQL Server as unstructured data, otherwise known as Binary Large Objects (BLOBs). Having many BLOBs in SQL Server causes several issues. Not only do they take up lots of storage space, they also use server resources.

Because a BLOB is unstructured data, any time a user accesses a file in SharePoint, the BLOB has to be reassembled before it can be delivered back to the user – taking extra processing power and time.

Solution:

Move BLOBs out of SQL Server and into a secondary storage location – specifically, a higher density storage array that is reasonably fast, like a file share or network attached storage (NAS).

Problem #2:

An avalanche of large media. Organizations today use a variety of large files such as videos, images, and PowerPoint presentations, but storing them in SharePoint can lead to performance issues because SQL Server isn’t optimized to house them.

Media files, especially, cause issues for users because they are so large and need to be retrieved fairly quickly. For example, a video file may have to stream at a certain rate, and applications won’t return control until the file is fully loaded. As more of this type of content is stored in SharePoint, it amplifies the likelihood that users will experience browser timeout, slow Web server performance, and upload and recall failures.

Solution:

For organizations that make SharePoint “the place” for all content large and small, use third-party tools specifically designed to facilitate the externalization of large media storage and organization. This will encourage user adoption and still allow you to maintain the performance that users demand.

Problem #3:

Old and unused files hogging valuable SQL Server storage. As data ages, it usually loses its value and usefulness, so it’s not uncommon for the majority of SharePoint content to go completely unused for long periods of time. In fact, 60 to 80 percent of the content in SharePoint is either unused or used only sparingly during its lifespan. Many organizations waste space by applying the same storage treatment to this old, unused data as they do to new, active content, quickly degrading both SQL Server and SharePoint performance.

Solution:

Move less active and relevant SharePoint data to less expensive storage, while still keeping it available to end users via SharePoint. In the interface, it helps to move these older files to different parts of the information architecture, to minimize navigational and search clutter. Similarly, we can “unclutter” the storage back end.

A third-party tool that provides tiered storage will enable you to easily move each piece of SharePoint data through its life cycle to various repositories, such as direct attached storage, a file share, or even the cloud. With tiered storage, you can keep your most active and relevant data close at hand, while moving the rest to less expensive and possibly slower storage, based on the particular needs of your data set.

Problem #4:

Lack of scalability. As SharePoint content grows, its supporting hardware can become underpowered if growth rates weren’t accurately forecasted. Organizations unable to invest in new hardware need to find alternatives that enable them to use best practices and keep SharePoint performance optimal. Microsoft guidance suggests limiting content databases to 200GB maximum unless disk subsystems are tuned for high input/output performance. In addition, huge content databases are cumbersome for backup and restore operations.
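To check where your farm stands against that guidance, content database sizes can be listed from the SharePoint Management Shell. A minimal sketch (DiskSizeRequired reports bytes):

# Minimal sketch: list content databases and their approximate size in GB, largest first,
# to compare against the 200 GB guidance mentioned above.
Get-SPContentDatabase |
    Select-Object Name, @{ Name = "SizeGB"; Expression = { [math]::Round($_.DiskSizeRequired / 1GB, 1) } } |
    Sort-Object SizeGB -Descending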

Solution:

Offload BLOBs to the file system – thus reducing the size of the content database. Again, tiered storage will give you maximum flexibility, so as SharePoint data grows, you can direct it to the proper storage location, either for pure long-term storage or zippy immediate use.

It also lets you spread the storage load across a wider pool of storage devices. This approach keeps SharePoint performance high and preserves your investment in existing hardware by prolonging its useful life in lieu of buying expensive hardware. It’s simpler to invest in optimizing a smaller SQL Server storage core than a full multi-terabyte storage footprint, including archives.

Problem #5:

Not leveraging Microsoft’s data externalization features. Microsoft’s recommended externalization options are Remote BLOB Storage (RBS), a SQL Server API that enables SharePoint 2010 to store BLOBs in locations outside the content databases, and External BLOB Storage (EBS), a SharePoint API introduced in SharePoint 2007 SP1 and continued in SharePoint 2010.

Many organizations haven’t yet explored these externalization capabilities, however, and are missing out on significant storage and related performance benefits. However, native EBS and RBS require frequent T-SQL command-line administration, and lack flexibility.
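For reference, this is roughly what native RBS administration looks like once an RBS provider (for example, the SQL Server FILESTREAM provider) has been installed against a content database. A minimal sketch from the SharePoint Management Shell, assuming a content database named WSS_Content:

# Minimal sketch: enable Remote BLOB Storage on a content database.
# Assumes an RBS provider is already installed for this database;
# the content database name is an assumption.
$cdb = Get-SPContentDatabase "WSS_Content"
$rbs = $cdb.RemoteBlobStorageSettings

$rbs.Installed()                                   # True if a provider is installed for this database
$rbs.Enable()                                      # enable RBS on the content database
$rbs.SetActiveProviderName($rbs.GetProviderNames()[0])
$rbs.ActiveProviderName                            # confirm the active provider name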

Solution:

Use a third-party tool that works with Microsoft’s supported APIs, RBS, and EBS, and gives administrators an intuitive interface through SharePoint’s native Central Administration to set the scope, rules and location for data externalization.

In each of these five problem areas, you can see that offloading the SharePoint data to more efficient external storage is clearly the answer. Microsoft’s native options, EBS and RBS, only add to the complexity of managing SharePoint storage, however, so the best option to improve SharePoint performance and reduce costs is to select a third-party tool that integrates cleanly into SharePoint’s Central Administration. This would enable administrators to take advantage of EBS and RBS, choosing the data they want to externalize by setting the scope and rules for externalization and selecting where they want the data to be stored.

 

Create Event Receivers using Visual Studio (SharePoint 2010)

Microsoft Visual Studio 2010 provides a project type that enables you to build event receivers that perform actions before or after selected events on a Microsoft SharePoint 2010 site. This example shows how to add an event to the adding and updating actions for custom list items.

This SharePoint Visual How To describes the following steps for creating and deploying an event receiver in Visual Studio 2010:

  1. Overriding the ItemAdding event and the ItemUpdating event.
  2. Verifying that the list to which the item is being added is the Open Positions list.
  3. Elevating permissions so that the code can access a secure site to retrieve approved job titles.
  4. Comparing the approved job titles with the title of a new item that is created in the Open Positions list.
  5. Canceling the event when the job title is not approved.

In this example, a secure subsite contains a list named Job Definitions that specifies allowed job titles for roles in the organization. Along with job titles, the list also contains confidential salary information for each job title and is therefore secured from users. In the main site, a list named Open Positions tracks vacancies in the organization. You create two event receivers, for the ItemAdding and ItemUpdating events, that verify that the title of the open position matches one of the approved titles in the Job Definitions list.

Prerequisites

Before you start, create the subsite and lists that you will need.

To create the Job Definitions subsite

  1. On the main site, on the Site Actions menu, click New Site.
  2. In the New Site dialog box, click Blank Site.
  3. On the right of the dialog box, click More Options.
  4. In the Title box, type Job Definitions.
  5. In the Web Site Address box, type JobDefinitions.
  6. In the Permissions section, click Use Unique Permissions, and then click Create.
  7. In the Visitors to this site section, select Use an existing group, and then select Team Site Owners. Click OK.

To create the Job Definitions list

  1. In the Job Definitions site, create a custom list named Job Definitions with the following columns:

Title (Default column)
MinSalary (Currency)
MaxSalary (Currency)
Role Type (Choice: Permanent, Contract)

  2. Add several jobs to this list. Note the titles that you specify for each job that you create because you will need them later.

To create the Open Positions list

In the parent site, create a custom list named Open Positions with the following columns:

Title (Default column)
Location (Single line of text)
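If you prefer to script the list creation, a minimal sketch using the server object model from the SharePoint Management Shell could look like the following; the parent site URL is an assumption:

# Minimal sketch: create the Open Positions custom list with a Location column.
# The parent site URL is an assumption; run in the SharePoint Management Shell.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://intranet"                 # assumption: the parent site URL
$listId = $web.Lists.Add("Open Positions", "Vacancies in the organization",
    [Microsoft.SharePoint.SPListTemplateType]::GenericList)
$list = $web.Lists[$listId]
$list.Fields.Add("Location", [Microsoft.SharePoint.SPFieldType]::Text, $false) | Out-Null
$list.Update()
$web.Dispose()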

Creating an Event Receiver

Next, create an Event Receiver project in Visual Studio 2010, and add code to the event receiver file.

To create a SharePoint 2010 event receiver in Visual Studio 2010

1. Start Visual Studio 2010.
2. On the File menu, click New, and then click Project.
3. In the New Project dialog box, in the Installed Templates section, expand either Visual Basic or Visual C#, expand SharePoint, and then click 2010.
4. In the template list, click Event Receiver.
5. In the Name box, type VerifyJob.
6. Leave other fields with their default values, and click OK.
7. In the What local site do you want to use for debugging? list, select your site.
8. Select the Deploy as a farm solution option, and then click Next.
9. On the Choose Event Receiver Settings page, in the What type of event receiver do you want? list, select List Item Events.
10. In the What Item should be the event source? list, select Custom List.
11. Under Handle the following events, select the An item is being added and the An item is being updated check boxes. Click Finish.

To modify the event receiver file

In the event receiver file, add the following method to the class.

bool checkItem(SPItemEventProperties properties)
{
    string jobTitle = properties.AfterProperties["Title"].ToString();
    bool allowed = false;
    SPWeb jobDefWeb = null;
    SPList jobDefList;

    // Use the system account's token so the code can read the secured subsite.
    SPUser privilegedAccount = properties.Web.AllUsers[@"SHAREPOINT\SYSTEM"];
    SPUserToken privilegedToken = privilegedAccount.UserToken;
    try
    {
        using (SPSite elevatedSite = new SPSite(properties.Web.Url, privilegedToken))
        {
            using (SPWeb elevatedWeb = elevatedSite.OpenWeb())
            {
                jobDefWeb = elevatedWeb.Webs["JobDefinitions"];
                jobDefList = jobDefWeb.Lists["Job Definitions"];

                // Allow the item only if its title matches an approved job title.
                foreach (SPListItem item in jobDefList.Items)
                {
                    if (item["Title"].ToString() == jobTitle)
                    {
                        allowed = true;
                        break;
                    }
                }
            }
        }
        return (allowed);
    }
    finally
    {
        if (jobDefWeb != null)
        {
            jobDefWeb.Dispose();
        }
    }
}

In the EventReceiver1 file, replace the ItemAdding method with the following code.

public override void ItemAdding(SPItemEventProperties properties)
{
    try
    {
        bool allowed = true;
        if (properties.ListTitle == "Open Positions")
        {
            allowed = checkItem(properties);
        }
        if (!allowed)
        {
            properties.Status = SPEventReceiverStatus.CancelWithError;
            properties.ErrorMessage =
                "The job you have entered is not defined in the Job Definitions List";
            properties.Cancel = true;
        }
    }
    catch (Exception ex)
    {
        properties.Status = SPEventReceiverStatus.CancelWithError;
        properties.ErrorMessage = ex.Message;
        properties.Cancel = true;
    }
}

In the EventReceiver1 file, replace the ItemUpdating method with the following code.
public override void ItemUpdating(SPItemEventProperties properties)
{
    try
    {
        bool allowed = true;
        if (properties.ListTitle == "Open Positions")
        {
            allowed = checkItem(properties);
        }
        if (!allowed)
        {
            properties.Status = SPEventReceiverStatus.CancelWithError;
            properties.ErrorMessage =
                "The job you have entered is not defined in the Job Definitions List";
            properties.Cancel = true;
        }
    }
    catch (Exception ex)
    {
        properties.Status = SPEventReceiverStatus.CancelWithError;
        properties.ErrorMessage = ex.Message;
        properties.Cancel = true;
    }
}
To deploy the project

1. In Solution Explorer, right-click the project, and then click Deploy. (Alternatively, you can deploy the generated .wsp package with PowerShell, as in the sketch at the end of this section.)
2. In the SharePoint site, in the Open Positions list, click Add new item.
3. In the Title field, provide a title for a job description that does not exist in the Job Definitions list in the secured subsite.
4. Click Save. You receive an error message from the event receiver.
5. In the Title field, provide a title for a job description that exists in the Job Definitions list in the secured subsite.
6. Click Save. The position is created.
The solution overrides the ItemAdding and ItemUpdating methods and verifies whether the list being added to is the Open Positions list. If it is, a call is made to the checkItem method, passing in the properties associated with the event. In the checkItem method, permissions are elevated to ensure successful access to the secured subsite, and the job titles in the approved list are compared to the job title in the AfterProperties collection associated with the event; if any title matches, the allowed Boolean variable is set to true and the method returns. Depending on the value of the allowed variable, the calling method either permits the event or sets the ErrorMessage property and cancels the event by setting properties.Cancel.
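As an alternative to Visual Studio's Deploy command (see step 1 above), the generated .wsp package can be added and installed with the SharePoint cmdlets. A minimal sketch; the package path is an assumption:

# Minimal sketch: deploy the packaged VerifyJob solution with PowerShell.
# The .wsp path is an assumption; Visual Studio places the package under the project's bin\Debug folder.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Add-SPSolution -LiteralPath "C:\Deploy\VerifyJob.wsp"
# -GACDeployment is needed because the event receiver assembly is deployed to the GAC;
# add -WebApplication <url> only if the solution contains web-application-scoped resources.
Install-SPSolution -Identity "VerifyJob.wsp" -GACDeployment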