Tag Archives: SharePoint 2013

PowerShell and SharePoint Modules

There are a lot of posts about creating PowerShell modules for SharePoint. I decided to write this post so everything I've learned is in one place. I'm hoping you learn something from it (as I did).

I have tested the code in this article against SharePoint on-premises 2010, 2013 and 2016.

Modules have the advantage that, once created and loaded, the functions in your module are available in every PowerShell session.

Loading the Microsoft.SharePoint.Powershell Snapin

First, make sure the Microsoft.SharePoint.PowerShell snapin is always loaded. This way you are always certain the snapin is available, and it's not necessary to add it to every script file you create.

Note: this only has to be done once and will be available for all future PowerShell sessions.

Open the PowerShell ISE and run the following at the command line.  This will create a new profile script (profile.ps1), or open it if it already exists.

PS C:\> if (!(Test-Path $profile.AllUsersAllHosts)) {New-Item -Type File -Path $profile.AllUsersAllHosts -Force}
PS C:\> powershell_ise $profile.AllUsersAllHosts

Now that the profile.ps1 file is open in the PowerShell ISE, add the following code.  If there is already code in the profile.ps1 file, make sure the following doesn’t already exist and, if not, add it.

$ver = $host | select version
if ($ver.Version.Major -gt 1) {$host.Runspace.ThreadOptions = "ReuseThread"} 
if ((Get-PSSnapin "Microsoft.SharePoint.PowerShell" -ErrorAction SilentlyContinue) -eq $null) 
{
Add-PSSnapin "Microsoft.SharePoint.PowerShell"
}

Here is what the above looks like in my PowerShell session.

profile.ps1 (screenshot)

Make sure to close this PowerShell instance and open a new one. To test that the Microsoft.SharePoint.PowerShell snapin has been loaded into your new PowerShell instance, type Get-SPFarm at the command line; it should now execute without any issues.

Now that we have the Microsoft.SharePoint.PowerShell snapin loading with every instance of PowerShell, let's move on to creating our first module.

Creating a Script/Module Template

A PowerShell module is fundamentally the same as any other script file with the addition of a few elements.  Let’s start by showing you the fundamental header, body and footer that is in every one of my script files.  In fact, I have this as a template file and use it each time I wish to create a new module.

<#
	.SYNOPSIS
	.DESCRIPTION
	.NOTES
#>
function Verb-YourFunctionName
{
    <#
    .SYNOPSIS

    .DESCRIPTION

    .PARAMETER Web

    .PARAMETER LeadingSpaceCount
        Used to control verbose output. If -1 (default) no verbose output will be rendered. Any value of zero or greater, will generate verbose output to the host. The value of this parameter is used to indent the verbose output. For example, a value of 2 will put two spaces before the output.

    .EXAMPLE
    .EXAMPLE
    #>
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)][Microsoft.SharePoint.PowerShell.SPWebPipeBind]$Web,
        [Parameter(Mandatory=$false)]
            [System.Int32]$LeadingSpaceCount=-1
    )

    # Setup the leading space for verbose output.
    # If -1 (default) no verbose output is rendered. Any value of zero or greater generates
    # verbose output to the host, indented by that many spaces.
    $ls = ""
    if($LeadingSpaceCount -ge 0)
    {
        $ls = "".PadRight($LeadingSpaceCount," ")
        Write-Host "$ls- WHAT THIS FUNCTION DOES" -ForegroundColor Yellow
        $LeadingSpaceCount++
        $ls = "".PadRight($LeadingSpaceCount," ")
        $LeadingSpaceCount++
    }
    try
    {
        [Microsoft.SharePoint.SPWeb]$oWeb = Get-SPWeb $Web
        # Your function logic goes here; assign its result to $SomeReturnVar.
    }
    catch
    {
        Write-Host ([System.String]::Format("$ls- EXCEPTION: [{0}] [{1}]", $MyInvocation.MyCommand.Name, $_.Exception.Message)) -ForegroundColor Red
        Write-Host
        throw $_.Exception
    }
    finally
    {
        if($oWeb -ne $null){$oWeb.Dispose()}
    }
    return($SomeReturnVar)
}

#region Module Members to be Exported
if([System.IO.Path]::GetExtension($MyInvocation.ScriptName) -like ".psm1")
{
    Export-ModuleMember -Function Verb-YourFunctionName -ErrorAction SilentlyContinue
}
#endregion

The header section of the module is simply a set of comments describing the functionality contained within the script/module file.  I tend to break up my functions into separate files so maintenance is simplified.

The body section of the module is the function or code we are writing.  In the case of my template, the function is named Verb-YourFunctionName.  When being used, the function name will be modified to match the need.

The footer section is where we tell the Add-PSModule code (in the next section) what functions are to be included in our module.

#region Module Members to be Exported
if([System.IO.Path]::GetExtension($MyInvocation.ScriptName) -like ".psm1")
{
    Export-ModuleMember -Function Verb-YourFunctionName -ErrorAction SilentlyContinue
}
#endregion

You will change the Verb-YourFunctionName to the actual name of your function in this module.
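As a purely illustrative example (the function name, logic and return value here are invented for this post, not part of the original template), the skeleton filled in for a small function might look like this:

```powershell
<#
	.SYNOPSIS
	Illustrative module file containing a single function.
#>
function Get-SPWebTitleInfo
{
    <#
    .SYNOPSIS
        Returns the title of the specified web. Illustrative only.

    .PARAMETER Web
        The URL, GUID or SPWeb instance of the web to inspect.

    .EXAMPLE
        Get-SPWebTitleInfo -Web "http://intranet/sites/hr"
    #>
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)][Microsoft.SharePoint.PowerShell.SPWebPipeBind]$Web
    )

    try
    {
        [Microsoft.SharePoint.SPWeb]$oWeb = Get-SPWeb $Web
        $sTitle = $oWeb.Title
    }
    finally
    {
        # Always dispose SPWeb objects you open yourself.
        if($oWeb -ne $null){$oWeb.Dispose()}
    }
    return($sTitle)
}

#region Module Members to be Exported
if([System.IO.Path]::GetExtension($MyInvocation.ScriptName) -like ".psm1")
{
    Export-ModuleMember -Function Get-SPWebTitleInfo -ErrorAction SilentlyContinue
}
#endregion
```

Saved as Get-SPWebTitleInfo.ps1, a file like this can be listed in the script file array shown in the next section.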

Using Add-PSModule to Add Your Module

The last step in this process is to use the Add-PSModule to add the module to your PowerShell session.

Personally, I do this in a PowerShell function; one for each different module I wish to create.  For example, I have BMCommon, BMActiveDirectory, BMSharePoint2013, etc. modules.  In most of my client environments, I install these modules so all of my extended functions and variables are available to all instances of PowerShell.

function Add-BMSharePoint2013PSModule
{
    Param(
        [Parameter(Mandatory=$false)]
            [string]$ModuleInstallPath = "C:\Windows\System32\WindowsPowerShell\v1.0\Modules",
        [Parameter(Mandatory=$false)]
            [int]$LeadingSpaceCount=-1
    )

    $ls = ""
    if($LeadingSpaceCount -ge 0)
    {
        $ls = "".PadRight($LeadingSpaceCount," ")
        $LeadingSpaceCount++
    }

    $sScriptSourcePath = Split-Path -parent `
        $PSCommandPath
    $sModuleName = "BMSharePoint2013"
    $sManifestDescription = "Contains SharePoint 2013 management extended functions."

    $aScriptFiles = "Add-ADUserToSPGroup.ps1",
        "Add-SPAppTile.ps1",
        "Add-SPSitePermissionLevel.ps1"

    $bSuccess = Add-PSModule `
        -ScriptSourcePath $sScriptSourcePath `
        -ModuleName $sModuleName `
        -ScriptFiles $aScriptFiles `
        -ManifestAuthor "Bob Mixon" `
        -ManifestCompanyName "Bob Mixon" `
        -ManifestDescription $sManifestDescription `
        -ModuleInstallPath $ModuleInstallPath `
        -LeadingSpaceCount $LeadingSpaceCount

    return($bSuccess)
}

Add-BMSharePoint2013PSModule -LeadingSpaceCount 0
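Add-PSModule itself is my own helper function and its implementation isn't shown in this post. For readers who want to experiment, here is a rough sketch of what such a helper might do (my assumption, not the actual code): concatenate the listed script files into a .psm1 and generate a manifest with New-ModuleManifest. Note that the -RootModule parameter assumes PowerShell 3.0 or later; on PowerShell 2.0 you would use -ModuleToProcess instead.

```powershell
function Add-PSModule
{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)][string]$ScriptSourcePath,
        [Parameter(Mandatory=$true)][string]$ModuleName,
        [Parameter(Mandatory=$true)][string[]]$ScriptFiles,
        [Parameter(Mandatory=$false)][string]$ManifestAuthor = "",
        [Parameter(Mandatory=$false)][string]$ManifestCompanyName = "",
        [Parameter(Mandatory=$false)][string]$ManifestDescription = "",
        [Parameter(Mandatory=$true)][string]$ModuleInstallPath,
        [Parameter(Mandatory=$false)][int]$LeadingSpaceCount = -1
    )

    try
    {
        # Create the module folder, e.g. ...\Modules\BMSharePoint2013.
        $moduleFolder = Join-Path $ModuleInstallPath $ModuleName
        if(!(Test-Path $moduleFolder))
        {
            New-Item -ItemType Directory -Path $moduleFolder | Out-Null
        }

        # Concatenate all of the listed script files into a single .psm1.
        $psm1Path = Join-Path $moduleFolder "$ModuleName.psm1"
        $ScriptFiles |
            ForEach-Object { Get-Content (Join-Path $ScriptSourcePath $_) } |
            Set-Content -Path $psm1Path

        # Generate a manifest next to it so the module loads cleanly.
        New-ModuleManifest -Path (Join-Path $moduleFolder "$ModuleName.psd1") `
            -RootModule "$ModuleName.psm1" `
            -Author $ManifestAuthor `
            -CompanyName $ManifestCompanyName `
            -Description $ManifestDescription

        return($true)
    }
    catch
    {
        Write-Host ("- EXCEPTION: [{0}]" -f $_.Exception.Message) -ForegroundColor Red
        return($false)
    }
}
```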


This is all you really need to do!  The high-level steps for accomplishing this are:

  1. Make sure the PowerShell snapin “Microsoft.SharePoint.PowerShell” has been loaded in the PowerShell profile.ps1 file.
  2. Create your new function script file/module; include the Export-ModuleMember at the bottom of the script file/module.
  3. Create and run your module-creation function (like Add-BMSharePoint2013PSModule above) to build and install the module.

Once these steps are complete, you can close your PowerShell instance, open another and your functions will be available.


Document Libraries and Folders

I am asked on a regular basis how I feel about the use of Document Library folders. To be honest, I always tell people “avoid them” unless you thoroughly understand the consequences of using folders or have a controlled means of promoting consistent use. This response always stirs up controversy so I thought I’d write this article to help you better understand my thinking.

Before I dive too deep in to this article, I need to first say; there is no right or wrong way of implementing your document management solution. There are simply better ways of surfacing information and improving the user experience. If you are concerned about findability and user adoption, then read-on! Another prerequisite to reading this article is to understand, this is not a black and white topic; meaning, it’s not an all or nothing issue. I believe document library folders have a place and can be used as long as the consequences are thoroughly understood.

Do I use them in my implementations? Absolutely… However, I do so in a very limited manner and very specific ways. With all that said, let’s move on…

First, let's take a look at why you would use document library folders. The only reason that immediately comes to mind is: because that is what we are used to doing! I personally don't think this is a great reason, but it does hold some truth. We have pushed users to use file shares and local drives, all of which have folders and nested folders.

Now let's take a look at why we shouldn't use document library folders, or ways of improving the user experience.

Navigating document library folders is not the same as what users experienced when storing/managing files on file shares and other drives. The tree view, with +/- to expand/collapse branches, doesn’t exist in the out-of-box Document Library Web Part. Navigating down a folder branch requires the simple click of a link. However, navigating back up the branch requires the user to click the browser back button. This is a cumbersome user experience at best.

Just so you are aware, there are options to help solve some of the navigation issues described above. You could write a custom Web Part, find publicly available code or purchase a 3rd party product. There is code and products available. However, I would first recommend you continue reading because these products won’t solve all of the problems you will have.

In addition to the navigation issues described above, folders hide the content contained within; especially when you use sub-folders. For example, when a user arrives at a site and sees a library named Shared Documents (which I also don't recommend using), the name of that library doesn't tell them anything about what it contains. And if the user clicks on the library, all they see is the first layer of folders. This may begin to indicate what files are contained within those folders, but it doesn't suggest anything about what might be contained within sub-folders. Again, this hides content and detail from users, forcing them to traverse folders until they find what they are looking for. Not very efficient at all.

Remember, if it takes a long time for a user to find an important document, quite often they will make a local copy for future reference; so they don’t have to go through the pains of finding it again on your Intranet. Findability is extremely important!

I also have to make a comment about manually managing folder-level security: avoid it at all costs! Its use can and will become confusing for your user base. I have seen structures become so complex that contributors lose confidence knowing where to store and manage their content. And what happens when contributors lose comfort and confidence in the solutions we implement? They stop using them and go back to what they feel secure and comfortable with. Something else to remember: security is one of the most complex features to teach a non-technical user. If you are expecting your non-technical user base to manage item and folder-level security, it will eventually be done incorrectly; this brings up a whole different set of issues (a good topic for another article). And if you aren't expecting the general non-technical user to manage their own security (which is good; I don't recommend it), then you are placing the burden of that management on Site Collection Administrators, IT or whomever you have it assigned to.

Lastly, let's talk about content duplication and the single source of truth. Content duplication occurs in an organization in many different ways; so many that we don't want to add to the problem. Folders hide content. As such, if it takes a long time for a user to find an important document, they are more likely to make a local copy of that document so they don't have to find it again in the future. Another way duplication occurs with folders is when a document needs to be present in multiple folders. For example, let's say you have a document that is related to 2 of 3 areas of Human Resources Benefits; related to Medical and Dental Benefits but not Vision Benefits. If you have benefit documents broken down in folders by benefit type (which is common), you may have to duplicate this common document in 2 places. Again, we have to avoid the duplication of content in our document management system at all costs. It's called managing the single source of truth!

So… what can be done to solve these problems, improving the user experience and overall findability?

First and foremost, consider using metadata! Whether you choose to continue using a Document Library named Shared Documents or not, I would recommend you start applying Content Types and metadata to your documents. As a simplified example, move all of your documents in a library to the root folder and apply metadata that can be used to filter the view. Still not the greatest user experience; however, a user will at least be able to see all of the files contained within a library simply by clicking the library link. They can then filter the view down using the metadata column filtering abilities built in to SharePoint.
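As a sketch of that metadata approach (the site URL, library name and column values here are illustrative, not from a real environment), a filterable choice column can be added to a library with the server-side object model:

```powershell
# Illustrative URL and library; adjust for your environment.
$web = Get-SPWeb "http://intranet/sites/hr"
try
{
    $list = $web.Lists["Shared Documents"]

    # Add a choice column that users can filter the view on.
    $internalName = $list.Fields.Add("BenefitType", [Microsoft.SharePoint.SPFieldType]::Choice, $false)
    $field = [Microsoft.SharePoint.SPFieldChoice]$list.Fields.GetField($internalName)
    $field.Choices.Add("Medical")
    $field.Choices.Add("Dental")
    $field.Choices.Add("Vision")
    $field.Update()
}
finally
{
    $web.Dispose()
}
```

Once the column is populated, the built-in column header filtering gives users exactly the view-narrowing experience described above, without a single folder.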

Using more Document Libraries is an even better approach to solving the overuse-of-folders issue. Let me demonstrate: a user arrives at a Human Resources site and is presented with a Document Library named Shared Documents, with folders and sub-folders. Well, you already know how I feel about this user experience. Now improve the experience by creating multiple Document Libraries: one named Medical Benefit Documents, another named Dental Benefit Documents and another named Vision Benefit Documents. This approach alone will significantly improve the user experience from a findability perspective. The Medical Benefit Documents library has a Content Type associated with it named Medical Benefit Document, which guides the contributor to store and manage only documents of type Medical Benefit. This helps with your overall IA too; but that's a whole different article series! Just be certain not to make your Document Libraries so generic that you have to manage too many Content Types with them; this can also be confusing for contributors.

Another approach, and one I use all the time, is to build your hierarchy with sub-sites. Take Human Resources Benefits for example. You may be better suited having a Benefits sub-site and then sub-sites below that, one for each specific type of benefit (Medical, Dental and Vision). There are many, many advantages to using sub-sites in this manner. Sub-sites used in this manner are really nothing more than what is called "a point of aggregation and association". Let me provide you with another example. On a Medical Benefits sub-site you can store and manage related announcements, policies, procedures, forms, FAQs, glossary terms, links to external references and so on. You are basically bringing the user to a new level of experience when you aggregate all information associated with Medical Benefits on a single sub-site.

Another advantage to using sub-sites as a point of aggregation and association is realized when you begin to aggregate content to a higher level on your Intranet. For example, if you choose to aggregate all Policy Documents (documents of type Policy) into an aggregate view using the Content Query Web Part, those policies can be grouped by site/sub-site. This is a very useful view and cannot be produced automatically using folders.

So how do you handle the folder-level security confusion issue described above, Bob? Great question, and I'm glad you asked! Remember, my implementations utilize sub-sites and multiple document libraries. I simply place a Content Editor Web Part above the Document Library Web Part and use it to describe the library's intended use and security. I target this Web Part to the contributor audience so it is only displayed for those who are contributors on that specific site or page.

The document management/Intranet solutions I have implemented contain a wide range of the techniques described above. The actual implementation techniques used depend on the size of the organization, amount of content, contributors, consumers, security and a myriad of other requirements. It is my hope that some of these techniques help with your next implementation or give you ideas for improving your current implementation.

Please leave questions and comments below as it tends to help me understand what other articles I need to write!

Until next time…

PowerShell: Retrieve All Sites With or Without Personal Sites (MySites)

I love using the pipeline in PowerShell.  Recently I was in a situation where, if a PS script was a one-liner, I didn't have to submit it through the review/approval/change management process.  It's amazing what you can do in PowerShell in a single line of code!

Retrieve all site collections, for all web applications in the current farm.

$oSites = Get-SPWebApplication | Get-SPSite -Limit All

That one is pretty simple and I'm sure you've done it a bazillion times.  Let's add a little more and return all site collections, for all web applications, except personal sites!

$oSites = Get-SPWebApplication | Get-SPSite -Limit All | where {($_.RootWeb.WebTemplateId -ne 54) -and ($_.RootWeb.WebTemplateId -ne 21)}

The only thing I’ve added here is a where clause that excludes sites with a template ID of 54 (personal site host) and 21 (personal sites).
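Staying with one-liners, the same pipeline can feed a quick inventory report. The output path below is just an example, and the calculated columns are my own additions:

```powershell
Get-SPWebApplication | Get-SPSite -Limit All | where {($_.RootWeb.WebTemplateId -ne 54) -and ($_.RootWeb.WebTemplateId -ne 21)} | select Url, @{Name="Template";Expression={$_.RootWeb.WebTemplate}}, @{Name="StorageMB";Expression={[math]::Round($_.Usage.Storage/1MB,2)}} | Export-Csv -Path "C:\Reports\SiteCollections.csv" -NoTypeInformation
```

Still a single line of code, so it slips under the same change-management bar.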

Debugging SharePoint Issues and ULS Log Files

I often see administrators and developers new to SharePoint find debugging difficult and complex.

When working with SharePoint, log files are your friend.  In large on-premise farms, locating issues within large log files can be time consuming and sometimes difficult.

When I am presented with an error that contains a correlation ID, I first resort to PowerShell instead of a ULS Viewer.

Two PowerShell cmdlets that are your friend are: Get-SPLogEvent and Merge-SPLogFile.

Before you can use these cmdlets in your PowerShell scripts, make sure to load the SharePoint PowerShell snapin.

if((Get-PSSnapin -Name Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue) -eq $null)
{
    Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
}

Get-SPLogEvent

The Get-SPLogEvent cmdlet will retrieve specific events from a ULS Log File.  For example, the following call will retrieve all entries that occurred during a specified time range:

Get-SPLogEvent -StartTime "12/04/2007 17:00" -EndTime "12/04/2007 18:00"

If you wish to retrieve ULS entries associated with a specific correlation ID, you can use the following:

Get-SPLogEvent | ? {$_.Correlation -eq "<Correlation ID>"} | Select Area, Category, Level, EventID, Message

Where <Correlation ID> is the ID you wish to filter on.

If you wish to display the results in a nicely formatted list, add Format-List:

Get-SPLogEvent | ? {$_.Correlation -eq "<Correlation ID>"} | Select Area, Category, Level, EventID, Message | Format-List

Be patient when running the Get-SPLogEvent cmdlet, as it can take quite a long time to traverse all the ULS log files.
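One way to cut that time down, assuming you know roughly when the error occurred, is to bound the query with -StartTime before filtering on the correlation ID (the ID below is a made-up example):

```powershell
# Hypothetical correlation ID; replace with the one from your error page.
$correlationId = "b66d1e9a-4b86-4b4a-b3b1-2f1a0c6a8d3e"

# Only scan the last 30 minutes of ULS entries instead of every log file.
Get-SPLogEvent -StartTime (Get-Date).AddMinutes(-30) |
    ? {$_.Correlation -eq $correlationId} |
    Select Area, Category, Level, EventID, Message |
    Format-List
```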

I have a diagnostics PowerShell library that contains many functions that simplify diagnosing issues, writing log files, etc.  One of the functions in this library is Get-SPLogEventByCorrelationID, which simply calls the Get-SPLogEvent cmdlet and filters the results by a specified correlation ID.

function Get-SPLogEventByCorrelationID
{
    [CmdletBinding()]
    Param([Parameter(Mandatory=$true)]
        [string]$CorrelationID
    )
    $logEntries = Get-SPLogEvent | ? {$_.Correlation -eq $CorrelationID} | Select Area, Category, Level, EventID, Message
    return($logEntries)
}

For more information, see Microsoft's documentation for the Get-SPLogEvent cmdlet.

Merge-SPLogFile

The Merge-SPLogFile cmdlet combines ULS log entries, from all servers in a SharePoint farm, to a single (specified) log file.

The following example will merge all ULS log files for the last hour:

Merge-SPLogFile -Path "C:\Logs\FarmMergedLog.log" -Overwrite

If you wish to merge all ULS log events for a specific correlation ID, you can use the following call:

Merge-SPLogFile -Path "C:\Logs\FarmMergedLog.log" -Correlation "<Correlation ID>" -Overwrite

Where <Correlation ID> is the ID you wish to filter on.

As with Get-SPLogEvent, I have included some common functions in my diagnostics library.  One that I use on a regular basis is Merge-SPLogFileByCorrelationID:

function Merge-SPLogFileByCorrelationID
{
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$true)]
            [string]$CorrelationID,
        [Parameter(Mandatory=$false)]
            [bool]$Overwrite=$false,
        [Parameter(Mandatory=$false)]
            [int]$LeadingSpaceCount=0
    )

    $ls = "".PadRight($LeadingSpaceCount," ")
    $diagConfig = Get-SPDiagnosticConfig
    $ulsLogLocation = $diagConfig.LogLocation + "\MergeLog-Correlation (" + $CorrelationID + ").log"
    Write-Verbose ([string]::Format("$ls- Writing merged logs to file [{0}].", $ulsLogLocation))
    if($Overwrite)
    {
        Merge-SPLogFile -Path $ulsLogLocation -Correlation $CorrelationID -Overwrite
    }
    else
    {
        Merge-SPLogFile -Path $ulsLogLocation -Correlation $CorrelationID
    }
}
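A small style note: the if/else around the -Overwrite switch can be collapsed with parameter splatting, which keeps a single call site if more optional switches accumulate. This behaves the same as the body of the function above:

```powershell
$mergeParams = @{
    Path        = $ulsLogLocation
    Correlation = $CorrelationID
}
if($Overwrite) { $mergeParams["Overwrite"] = $true }

Merge-SPLogFile @mergeParams
```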

Conclusion

With a little knowledge and the right tools, you can become efficient at debugging issues in SharePoint.  If you would like a copy of my diagnostic script, please contact me; I will be happy to send it to you.

Happy SharePointing!

AutoSPInstaller: SharePoint 2013 March 10, 2015 CU (KB2956166)

The AutoSPInstaller tool is not something I've written about in the past.  I'm not sure why, because it's a fantastic tool and I use it on a regular basis.  In general, it is a PowerShell-based SharePoint installation tool.  If you are unfamiliar with it, I do recommend you take a look at it here.

I use AutoSPInstaller to build SharePoint farms, including the creation of web applications, site collections, installation of PUs and CUs, etc.  The beauty of using a scripted approach is its consistency.  If your farm ever burns to the ground, it's a way to rebuild it just as it was.

I recently used AutoSPInstaller to build a SharePoint 2013 farm for a client.  We then made the decision to install SharePoint 2013 March 10, 2015 CU.  As with all farm-level modifications I make, I added this CU to the AutoSPInstaller updates directory then ran the installer again.  It ran the CU, psconfig and ensured the farm was in an operational state.  It worked flawlessly!

To accomplish this, download the three SharePoint 2013 March 10, 2015 CU installation files from the Microsoft site and place them in the SP\AutoSPInstaller\2013\Updates directory.  You then run the launch script again, on all servers, and the cumulative update will be installed for you.  In addition, the script will run psconfig, so you don't need to do that manually!

Lightning Conductor 2013 and Aggregating Form Library Content

I have been working with the new Lightning Conductor 2013 content roll-up web part recently, specifically the new custom column abilities. This web part allows you to add a custom column to a view, and it can contain any JavaScript and CSOM. This opens up a new world of possibilities.

If you are unfamiliar with the Lightning Conductor 2013 roll-up web part, I highly recommend getting familiar with it. Lightning Tools has versions for both on-premise and Office 365 environments.
For more information on The Lightning Conductor 2013 product, please click here.

As I was saying above, you can create a custom column and include any JavaScript and/or CSOM calls. Today, I ran into a situation where we were aggregating content from a form library. As you may be aware, there isn't a Title column present on the Form content type, so you need to include the Name field if you wish the file name to be displayed. The problem with this is the file name contains the .xml extension, and my client didn't want that displayed.

This is where the Lightning Conductor 2013 custom field came in to play. I simply created a new custom field, named FormattedTitle, and included the following:

[FileLeafRef].Substring(0, [FileLeafRef].Length - 4)

Worked like a charm!
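For context, that expression simply strips the last four characters (the dot plus xml). The same logic in plain PowerShell, along with a length-agnostic alternative using GetFileNameWithoutExtension, looks like this (the file name is a made-up example):

```powershell
$fileLeafRef = "ExpenseReport.xml"   # hypothetical form file name

# Equivalent of the custom column expression: strip the 4-character ".xml".
$trimmed = $fileLeafRef.Substring(0, $fileLeafRef.Length - 4)

# Safer when extension lengths vary.
$safe = [System.IO.Path]::GetFileNameWithoutExtension($fileLeafRef)

$trimmed   # ExpenseReport
$safe      # ExpenseReport
```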

I'll be writing more about this web part, especially how the JavaScript and CSOM custom column can be extended to pull in related data, etc.

If you are interested in purchasing the Lightning Conductor 2013 web part, please contact CollectiveKnowledge Solutions.

Redirect Users to a New Location Automatically

Recently, while working with a client, I noticed they had purchased a 3rd party web part to redirect users from one location to another.  There are other posts that describe a simple, out-of-box solution; however, I wanted to include it here for my readers and clients.

There isn’t any need to write custom code or purchase a 3rd party product to redirect users from one location to another.

Let's say, for example, you are in the process of migrating your SharePoint environment to a newer version.  As you perform the migration, you may wish to have a redirect on the old site sending the user to the new (migrated) site.  To accomplish this, use the following steps:

1 – On the site you wish to perform the redirect, add a Content Editor Web Part.

Add Content Editor Web Part

I’ve given the web part a title of “Redirect to New HR Site”.

2 – On the ribbon bar, in the Format Text tab, click the Edit Source button.  Then enter the redirect source.

(screenshot: the web part source editor)


The actual redirect is the <meta …/> tag line, everything else is simply telling the user what will happen.  I feel it is always a good idea to inform the user of what is about to happen and ask them to update their bookmark.

The actual redirect is formatted as follows:

<meta http-equiv="refresh" content="{seconds};url={target URL}"/>

Replace {seconds} with the number of seconds to delay before redirecting the user.  10 seconds is a very common delay.

Replace {target URL} with the URL of where you wish to redirect the user to.

Example, delay 10 seconds and redirect to HR site:

<meta http-equiv="refresh" content="10;url=https://cks.sharepoint.com/sites/contoso/departments/hr/"/>
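Putting it all together, the full Content Editor source might look like the following; the message wording is illustrative, and the URL is the example target from above:

```html
<p>The HR site has moved. You will be redirected to the new site in 10 seconds.
   Please update your bookmarks to the new location.</p>
<meta http-equiv="refresh" content="10;url=https://cks.sharepoint.com/sites/contoso/departments/hr/"/>
```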


10 Key Elements of Enterprise Information Architecture (#IA) – #3 Presentation

This article is the third in a series of ten that will help you better understand the 10 key elements of Enterprise Information Architecture.  To read the previous articles and the complete table of contents in this series, please click here.

Presenting Information
The manner in which information is presented can dramatically improve its value. Applying Information Architecture principles and techniques affords us the ability to query and present information in many different formats, thus supporting many different business contextual needs.

During the Subject Matter Expert (SME) interview process, used in the Understanding phase, formulate questions that inspire individuals to describe how they use and view information. Some information is best displayed as simple, linear lists. Other information is best displayed as charts, graphs in a dashboard and points of aggregation. During these same interviews and other assessment techniques, you can learn more about security requirements through the understanding of persona-based information access needs. Specific personas will have different visibility of information.

Note:
It has been my experience that applying techniques that simplify the storage and management of content should be your initial focus. You can then aggregate that information, through specific queries, and present it in a manner that best suits the consumer persona and contextual needs.

Multiple presentation models to support varying contextual needs

  • Lists
  • Dashboards
  • Printed reports
  • Charts
  • Search
  • Content aggregation, Scoping and Faceted Filtering

Persona-based Presentation

  • Senior Management
  • Departmental, Line-of-Business Management, Business Unit Management
  • Employees
  • Customers
  • Vendors
  • Partners

(figure 05.03.03: Presentation)

As mentioned before, the interview process will provide you with a wealth of information about the consumers of information and the best manner in which to present it. Other techniques you can employ to understand presentation requirements include usability studies, wire-frame design diagrams, screen mockups and proof of concept or pilot projects.

Techniques for understanding how information is to be presented include:

  • Interview information SME’s and consumers
    • Understanding how they use the information in their day-to-day business operations leads to the best approach for presenting information
  • “Day in the life of” scenarios
  • Usability studies
  • Wire-frame diagrams
  • Screen mockups
  • Proofs of Concept (POCs)

In SharePoint, information is stored and managed in sites, pages, lists and libraries. Other information can be incorporated using various tools, such as Business Connectivity Services (BCS), Excel Services, custom development, etc. To gain the most value from presenting this information, you first need to apply Information Architecture techniques: categorization, grouping, metadata, etc. Once you have architected your information, presenting it becomes much simpler.

Create content scopes (result sources) to group information for aggregate presentation and ad-hoc search. You can then utilize the new faceted filtering (refinement) features of SharePoint 2013 to refine the scoped content and produce a highly relevant set of results to support various business contextual needs.

More on this in later articles!


10 Key Elements of Enterprise Information Architecture (#IA) – #2 Understanding

This article is the second in a series of ten that will help you better understand the 10 key elements of Enterprise Information Architecture.  To read the first article and the complete table of contents in this series, please click here.

Understanding Information
Understanding information is the most important aspect of Information Architecture. Before we can create solutions around information, we must thoroughly understand how people use, think about and value it. This understanding of information can then drive solution implementation prioritization, trust in its accuracy/use and ability to aid with and improve day-to-day business operations.

Understanding information leads to:

  • How people think about and value information
  • How information is used by people and processes
  • How information is stored and managed
  • Identification of information ownership, responsibility and accountability
  • Standard naming conventions
  • Reduction in term ambiguity

(figure 05.03.02: Contract Term)
Most people use terms and names that have meaning to them. For example, when an employee in IT uses the term Contract, they could be referring specifically to a Service Level Agreement (SLA) Contract. As humans, we may be able to automatically derive the understanding of a topic by applying scope and context.

Using the previous example: if I am talking to an IT employee about server downtime, I understand the term Contract means SLA. If I am unclear, I ask!

Unfortunately, technology doesn’t have the ability to automatically derive this level of scope and context. For technology to support the various contextual needs, we must categorize and label information; i.e. the basis and need for Information Architecture.

As Information Architects, there are many techniques we can use to better understand how information is used. Having a thorough understanding of how information is used in day-to-day business operations is critical to designing and building a SharePoint solution that ultimately adds value.

Unfortunately, we cannot be experts in all areas of business within our organization. As such, the best approach to understanding information is to ask the experts. You will gain a wealth of knowledge by interviewing business domain SMEs, users (consumers), vendors and customers.

Most individuals in an organization are busy and may not have ample time to describe what they do and how they do it. In many cases, you can better prepare yourself for these conversations through self-education. A wealth of information and knowledge can be gained by inventorying existing file structures, file naming conventions and supporting systems.

If your organization has search tools, review and assess the search logs; these can often provide insight into what consumers are searching for.

Techniques for understanding information include:

  • Interview domain subject matter experts (SMEs)
    • Business domain SMEs, users, vendors, customers, etc.
  • “Day in the life of” scenarios
  • Card sorting sessions
  • Inventory content
    • Assess, audit, refine, prioritize, label and categorize
  • Often file structure hierarchies and existing website navigation taxonomies aid in understanding how users currently categorize and think about their content
  • Review search logs

Often, as IT personnel, we fall into the “build it and they will follow” trap. This is a recipe for failure with these types of solutions. Remember, our user base has had free access to all of their content when stored on file shares, local drives and other repositories. To simply pick up their content from those repositories and drop it into SharePoint adds no business value at all. In fact, doing so makes managing their documents more complex. For our user base to see value in a more complex approach to managing their documents, we must add business value. The only way to add business value is to understand their information and how it is used. Only then can we transform the way information is used and improve and streamline day-to-day business operations.

SharePoint 2013 Intranet Management, Design and Architecture Training

These ten elements are defined, in detail, in my SharePoint 2013 Management, Design and Architecture Training Course.  For those of you who recall my Information Architecture (#IA) course for #SharePoint 2010, this new course expands on all the new features of both on-premise and Office 365 environments.

If you are interested in learning more about how to implement a #SharePoint 2013 Intranet solution, please register for our next class.

Content Categorization – Common Categorization and Grouping Mistake

Staying in alignment with my 10 Key Elements of Enterprise Information Architecture, this article will describe a common categorization and grouping mistake I see on a regular basis.

With regard to content categorization and grouping, a common mistake I see is storing and managing all documents of a similar type, across ownership boundaries, in a single site or library. This type of content grouping considers only the consumer, not the content owners or contributors. For example, I often see a site with a single library used to store and manage all corporate policy documents. This approach causes many issues, including:

  • All policy document content owners, across department boundaries, must navigate to the policy site/library to add, edit or delete their owned policy documents. Doing so requires them to leave their departmental content domain.
  • Because all policy documents are stored in a single location, custom development effort is typically needed to ensure a consistent security model; meaning, policy document owners can add, edit and delete only their own policy documents. Such custom development can be very time consuming, dig deep into your implementation budget and require on-going maintenance.
  • Again, because all policy documents are stored in a single location, custom workflows for new document provisioning, editing, deletion, review and approval must typically be developed.

05.03.01.01 - Aggregation for Consumers
It is important to consider the consumer of content. However, long before this, you must consider the content owner and contributor. One of the top factors for implementing successful Intranets is to consider the storage and management of content first. If we don’t simplify the storage and management of content, adoption and use will be minimized.

Duplicate this approach for other types of content and your users will need to navigate to many different locations to manage their content. Often, I see these business owners maintaining many bookmarks to all the various sites where they manage content. This can, and will, become difficult to manage and can reduce solution adoption.

An alternative approach is to store and manage content as close to the point of ownership as possible. In the example above, policy documents should be stored and managed in the owning department’s site. This eliminates many of the issues related to a central location used to store all related documents. Storing content as close to the point of ownership as possible has the following benefits:

  • Content owners and contributors don’t need to remember where content is stored.
    • They simply go to their department site (or sub-site) and manage the policy documents they own.
    • Security can be configured such that all departmental content owners and contributors can manage the policy documents as needed. These departmental content owners and contributors cannot change any other department’s policy documents, and vice versa.
  • No custom coding is necessary to support a complex item-level security model.
  • Others changing content they don’t own is greatly reduced.
  • A single source of truth is maintained.
  • A higher degree of confidence is achieved and adoption is improved.
  • There is no need for complex review and approval workflows. Most of these can be created using out-of-box workflows or SharePoint Designer.
  • Maintenance is significantly lower.

05.03.01.01 - Storage for Contributors
I imagine your next question will be: How do I create an aggregate view of all corporate policies for the consumers in our organization? That is a great question, and it involves simple Information Architecture techniques.

Create a Policy Document content type and assign it to a document library, named Policy Documents, in each departmental site. The only type of document that can be stored in the Policy Documents library is the Policy Document content type. Once complete, it is a simple process to aggregate all documents of type Policy Document to a consumer site or page. The consumer site or page doesn’t store any content at all; it is simply a dashboard displaying an aggregate view of all policy documents.

Using the Content Search or Search Results web parts, the consumer aggregate view can be grouped by department and faceted filtering can be applied.
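As a sketch of what that configuration might look like (the content type and managed property names here are assumptions), the Content Search Web Part’s query text could be as simple as the following KQL:

```
ContentType:"Policy Document"
```

In the web part’s settings you would then sort or group results by a managed property mapped to the department column, and add that same property to a Refinement web part to enable faceted filtering.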

Using this approach, and considering the content owner and contributor first, should be at the top of your list. Implement your solution using sound Information Architecture techniques, such as the one described above, and you can do so much more with your corporate content.

Consider hiring a solid Information Architect next time. I promise the overall implementation time and cost will be lower, and the value of your information will be improved…

SharePoint 2013 Intranet Management, Design and Architecture Training

These ten elements are defined, in detail, in my SharePoint 2013 Management, Design and Architecture Training Course.  For those of you who recall my Information Architecture course for SharePoint 2010, this new course expands on all the new features of both on-premise and Office 365 environments.

If you are interested in learning more about how to implement a SharePoint 2013 Intranet solution, please register for our next class.

 

10 Key Elements of Enterprise Information Architecture (#IA) – #1 Content Categorization

This article is the first in a series of ten that will help you better understand the 10 key elements of Enterprise Information Architecture (#IA).

Information Architecture consists of many techniques and principles that are required to ultimately add value to day-to-day business operations. When we look at everything involved with Information Architecture, it can be overwhelming and complex. For these reasons, I have broken Information Architecture down into the following 10 Key Elements. These are by no means all-inclusive, but they can be considered the most important.

  1. Content Categorization (this article)
    Content Categorization – Common Categorization and Grouping Mistake
  2. Understanding
  3. Presentation
  4. Evolution
  5. Responsibility
  6. Process
  7. Metadata
  8. Search
  9. Security
  10. Governance

Content Categorization

In this first article, we will focus on Content Categorization. It is the first of the 10 Key Elements of Enterprise Information Architecture and, surprisingly, the one that is most often overlooked.

If you currently have SharePoint installed in your environment and are encountering issues, such as little organization, users unable to find their content, low adoption rate, etc., then you have most likely implemented your solution with little to no Information Architecture.

Content Categorization and classification is the process by which we identify and group content.  One key success factor for all SharePoint implementations is to reduce, if not eliminate, the question “where do I store and manage my content”.  Content Categorization and classification aids with this by providing a specific location and manner by which users store and manage their content.

Content Categorization and classification is one of the primary ways we query content for improved search relevancy and aggregation.

Content Classification
Content Classification consists of labeling types of content using labels that precisely describe what the content is. For example, instead of all documents being labeled merely as a Document type, we classify our documents with more precise labels, such as Policy Document, Client Contract Document, Vendor Contract Document, Project Plan Document, Medical Benefit Document, etc.

Content Classification in SharePoint is accomplished by using content types. A content type defines a single data type, such as Policy Document, and supports associating metadata, a template document, workflow and policies.

05.03.01 - Policy Document Content Type

Once your content has been classified, it becomes a relatively simple process to create search scopes (result sources) and points of aggregation. For example, it is quite easy to create a result source containing all documents of type Policy Document. We can then search all documents of type Policy Document and/or aggregate all Policy Documents to a policy book.
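As an illustrative sketch of provisioning such a content type with PowerShell (the site URL is hypothetical, and this assumes the Microsoft.SharePoint.PowerShell snapin is loaded, as described at the top of this page):

```powershell
# Create a "Policy Document" content type that inherits from the built-in Document type
$web = Get-SPWeb "http://intranet"   # hypothetical site URL

$parent = $web.AvailableContentTypes["Document"]
$ct = New-Object Microsoft.SharePoint.SPContentType($parent, $web.ContentTypes, "Policy Document")
$ct.Group = "Corporate Content Types"

# Add it to the site's content type gallery so it can be assigned to libraries
$web.ContentTypes.Add($ct) | Out-Null
$web.Dispose()
```

Adding the content type at the top-level site of a site collection makes it available to all sub-sites below it, so each departmental Policy Documents library can use the same definition.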

Content Grouping
Grouping consists of collecting content of a similar type. For example, a document library named Documents really doesn’t have much meaning and certainly doesn’t tell a user what is stored in it. However, a document library named Policy Documents or Benefit Documents clearly defines what is contained within.

05.03.01 - Policy Documents Library

Another example would be a Meeting Documents library on a project site. The Meeting Documents library might contain meeting agenda, meeting minutes and meeting action item documents. In this example, Meeting Documents is the document grouping principle and is a container for managing all documents related to project meetings.

Content Grouping isn’t limited to lists and libraries. Each container level in SharePoint can be used to apply grouping principles. Each of the following is a grouping container in SharePoint:

  • SharePoint Farm – The top-most grouping level in SharePoint. A corporate solution will have 1 to many SharePoint farms.
    • Web Application – Each SharePoint farm will include 1 to many web applications.
      • Site Collection – Each web application will contain 1 to many site collections.
        • Top-level Site – Each site collection will have a single top-level site.
          • List – Each top-level site can have 0 to many lists.
          • Document Library – Each top-level site can have 0 to many document libraries.
            • Folder – Each library can have 0 to many folders. (See Note Below)
          • Sub-site – Each top-level site can have 0 to many sub-sites.
            • List – Each sub-site can have 0 to many lists.
            • Document Library – Each sub-site can have 0 to many document libraries.
              • Folder – Each library can have 0 to many folders. (See Note Below)
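Because each of these levels is exposed through the SharePoint object model, a quick way to see the containment hierarchy of your own farm is to walk it with PowerShell. This is a sketch only; it must be run on a farm server with the Microsoft.SharePoint.PowerShell snapin loaded:

```powershell
# Walk the containment hierarchy: farm -> web applications -> site collections -> webs -> lists
foreach ($webApp in Get-SPWebApplication) {
    Write-Output "Web application: $($webApp.Url)"
    foreach ($site in $webApp.Sites) {
        Write-Output "  Site collection: $($site.Url)"
        foreach ($web in $site.AllWebs) {
            Write-Output "    Web: $($web.Url)"
            foreach ($list in $web.Lists) {
                Write-Output "      List/Library: $($list.Title)"
            }
            $web.Dispose()
        }
        $site.Dispose()
    }
}
```

Note the Dispose() calls; SPWeb and SPSite objects hold unmanaged resources and should be released explicitly when enumerated this way.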

 

As you can see, there are many grouping levels that can be applied to your SharePoint implementation. Each of these grouping levels has a very specific purpose and is used for many different reasons.

The goal is to reduce the use of folders, promoting them to document libraries instead. It is also very common to group similar information by topic within a site. For example, each department’s top-level site may have a sub-site titled Policies & Procedures. For Human Resources, this sub-site would contain many lists and libraries all related to human resources policies & procedures; i.e. policy documents, procedure documents, FAQs, glossary terms, etc.
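As a sketch, such a departmental sub-site could be created with the New-SPWeb cmdlet (the URL here is hypothetical; STS#0 is the standard Team Site template):

```powershell
# Create a "Policies & Procedures" sub-site below a department's top-level site
New-SPWeb -Url "http://intranet/hr/policies-procedures" `
          -Name "Policies & Procedures" `
          -Template "STS#0" `
          -UseParentTopNav
```

The -UseParentTopNav switch keeps the department’s top navigation on the new sub-site, so it reads as part of the departmental content domain.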

Note About Library Folders

The regular use of folders in libraries is not considered a best practice. There are very specific cases when folders do make sense and are used; however, these cases are limited.

SharePoint 2013 Intranet Management, Design and Architecture Training

These ten elements are defined, in detail, in my SharePoint 2013 Management, Design and Architecture Training Course.  For those of you who recall my Information Architecture course for SharePoint 2010, this new course expands on all the new features of both on-premise and Office 365 environments.

If you are interested in learning more about how to implement a SharePoint 2013 Intranet solution, please register for our next class.

June 19, 2014 – SharePoint 2013 Collaboration in the Cloud – Office 365


The Silicon Valley SharePoint User Group (SVSPUG) would like to invite you to our next meeting:

Meetings are held on the 3rd Thursday of every month starting at 5:30 pm and ending at 8:00 pm.

Date Thursday June 19th, 2014
Time 5:30 p.m. – 8:00 p.m.
Location Santa Clara Valley Transportation Authority
3331 North First Street, Building “A” Auditorium
San Jose, CA 95134
(Parking in the back of the building)

SharePoint 2013 Collaboration in the Cloud – Office 365

Agenda:

  • 5:30 – 6:00 p.m. – Arrive and socialize
  • 6:00 – 7:30 p.m. – Session
  • 7:30 – 8:00 p.m. – Q&A, Wrap-up

In this session, Ulysses Lugwig will discuss the following topics:

Office 365 came out strong in 2013, with many organizations choosing to add Office 365 to their enterprise agreements with Microsoft. While SharePoint 2010 and 2013 on-premise both have promising features for external collaboration, many companies instead use Box or Dropbox due to firewall limitations, security concerns, and complex forms-based authentication that simply make SharePoint too complex. However, with this surge in Office 365 subscriptions, now pushing 100 million users, there is an incredible opportunity to start using SharePoint as an external collaboration tool. In this presentation we will show you real-world examples of how organizations are using Office 365 for external collaboration, including topics such as branding, automatic document uploading, external sharing caveats, and, of course, custom application development.


Food and Beverage Sponsor


We would like to thank Learn iT! for sponsoring food and beverage at this meeting. Please take a moment to learn more about our sponsor.

Learn iT! has provided SharePoint training to businesses globally since SharePoint’s beginning. Working closely with SharePoint developers internally, we create custom programs that better focus on the needs of each identified group.

  • Classes taught at your location in your environment
  • Classes at our office in Santa Clara, San Francisco, New York or Live Online
  • Audience: IT Pros, Developers, Site Admins, End Users

Common Scenarios which Initiate a SharePoint Learning Program

  • Migration to a newer version of SharePoint
  • Introducing SharePoint to your organization for the first time
  • Re-releasing SharePoint (a second introduction) after low adoption
  • Using SharePoint to help improve current processes
  • Document Management Solutions; Records Management, Intranet

Why Choose Learn iT!

  • Training Experts & SharePoint Experts
  • Experienced in helping other organizations with similar situations
  • Train in your own environment
  • Flexible in delivery style whether you learn hands on or need to just attend a few seminars
  • Customizing content to target your specific goals

Learn more about Learn iT! SharePoint training at www.learnit.com/sharepoint

As a Microsoft Gold Partner, Learn iT! understands the robust and dynamic nature of the IT world. We consistently offer the most in-demand certification courses to help your network infrastructure operate efficiently and migrate seamlessly.


SharePoint Saturday Sacramento 2014 – Slide Deck

The SharePoint Saturday Sacramento 2014 event was a great success this year.  I’m not sure what the final head-count was, but there were a lot of attendees, great sessions and vendors.

Thank you, again, to everyone who made this such a great event…

My session was on the topic of Records Management and SharePoint 2013.  Here is the deck I delivered.

SPMDA – Records Management and SharePoint 2013

SMPDA - Records Management and SharePoint 2013 - Slide 1

Speaking at SharePoint Saturday Sacramento – May 31, 2014

I am honored to be chosen to speak at the SharePoint Saturday Sacramento (#SPSSAC) event, on May 31 of this year; just a couple weeks away.

I will be speaking on the topic of Records Management and how to accomplish Records Management tasks using Microsoft SharePoint 2013 technologies. I will describe what records management is and demonstrate a number of SharePoint 2013 features that can be used to meet your organization’s records management initiatives.

There is a great line-up of speakers at this free event.  Take a look here for a list of all the fantastic sessions that will be delivered.

I would like to thank all of those who have given their time to organize this event.  I know what it takes to do so and it is a lot of work.

I would also like to thank the sponsors; without them, this wouldn’t be possible!

If you haven’t already done so, PLEASE REGISTER!

 

SharePoint Saturday Sacramento – May 31, 2014

 

SHAREPOINT SATURDAY

May 31, 2014

Sacramento

 

The Northern California SharePoint Community is proud to host the 3rd SharePoint Saturday Sacramento

Join fellow SharePoint colleagues in attending over 25 sessions presented by MVPs, MCMs and other SharePoint leaders. Topics include:

  • SharePoint 2013/2010
  • Office 365 / SPO
  • Business Intelligence
  • Case Studies
  • Search
  • Social & Mobile with SharePoint Reporting
  • And more …

This event is FREE and open to the public. Network, learn and immerse yourself in tips, tricks and ideas from leading SharePoint Experts from around the country.

WHERE & WHEN
Patrick Hayes Learning Center
2700 Gateway Oaks Drive
Sacramento, CA 95833

Saturday, May 31, 2014
8:00 AM to 5:00 PM (PST)

REGISTRATION
Registration is required to attend.
http://spssac2014.eventbrite.com

Admission: Free

Continental breakfast, snacks and lunch are included. Parking is free, but space is limited.

Register now at http://spssac2014.eventbrite.com

Northern California ARMA Seminar Panel Member

On March 6, 2014, I will be a panel member at the Northern California ARMA Seminar.  I am honored the Northern California ARMA would invite me to be a panel member and am looking forward to it.

At this seminar, we will be focusing on Records Management and I have been asked to attend and share my experiences with Records Management in SharePoint.

This should be very informative and I hope to see you there!

Register for this Event Here

Who is ARMA?
ARMA International is a not-for-profit professional association and the authority on governing information as a strategic asset.

The association was established in 1955. Its more than 27,000 members include information managers, information governance professionals, archivists, corporate librarians, imaging specialists, legal professionals, IT managers, consultants, and educators, all of whom work in a wide variety of industries, including government, legal, healthcare, financial services, and petroleum, in the United States, Canada, and more than 30 other countries around the globe.
ARMA International offers invaluable resources such as:

  • Legislative and regulatory updates
  • Standards and best practices
  • Technology trends and applications
  • Live and Web-based education
  • Marketplace news and analysis
  • Books & videos on managing records and information
  • A global network of 27,000+ information management professionals and more than 10,000 professional members

ARMA International publishes Information Management magazine, the only professional journal specifically for professionals who manage information as part of their job description. The award-winning IM magazine is published bi-monthly and features articles on the hottest topics in information governance today, as well as marketplace news and analysis.

The association also develops and publishes standards and guidelines related to records management. It was a key contributor to the international records management standard, ISO-15489.

Factors that Drive the Success of Information Use

There are many factors that comprise the successful implementation of a SharePoint solution.  Regardless of what it is you are building, the following key success factors apply.

  1. Ease the Pains Associated with Storing and Managing Information
  2. Provide Meaningful and Accurate Information to Consumers
  3. Promote the Agile Evolution of Information Use
  4. Governance
  5. Education

Each of these Key Success Factors is described in more detail below.

Ease the Pains Associated with Storing and Managing Information

First and foremost, your information must be stored in a manner that is consistent, easy to understand and easy to maintain.  Without information, you wouldn’t have the need for a content management solution and without easing the pains associated with storing and managing this information, Contributors will be less likely to use the solution.

There are specific techniques that can be used when implementing SharePoint that can simplify the contribution and maintenance of information.

Content Categorization – We use content categorization to understand the types of content being stored, which aids with:

  • Reducing the question “where do I store and manage this content”,
  • Presetting metadata with default values, reducing the metadata values that must be manually entered by a user, and
  • Search scoping, information aggregation and metadata filtering.

Without adding some level of structure to your content, you will accomplish little. Storing and managing your content using the same approach used on file shares adds zero value. You will eventually end up with hundreds, if not thousands, of sites. Many of those sites will have document libraries with many folders and sub-folders. Users will quickly become uncertain of where content should be stored. Managing unstructured content without any level of categorization will always lead to diminishing adoption, frustration and distrust in the technology.

Take the time to categorize your content, apply grouping principles and governing rules for appropriate use, and you will be working towards a solution users adopt, becoming more efficient in executing day-to-day business operations.

Understanding – The implementation team must have a thorough understanding of the content to be managed; thus implementing a solution that is consistent and eliminates ambiguity.

For example, the term Contract can have many definitions in an organization; Employee Contract, Customer Contract, Vendor Contract, etc.  These contract examples have entirely different meanings, uses and associated processes.  By thoroughly understanding the information, a solution can be appropriately architected to eliminate this type of ambiguity.

Responsibility – The responsibility of information belongs in the hands of the information owners.  When we implement solutions that aid in the ownership and responsibility of managing information, that information is more likely to be fresh, up to date and accurate.

Process – Information only has real value when it is used in actions and to facilitate decisions:

  • Improved decision-making
  • Simplify work and information flows
  • Achieving action plans and change initiatives
  • Developing information value chains
  • Maximizing use of information

Making process an integral part of architecture is a great way of adding value to your business.

Metadata – Metadata allows you to store and manage instance-specific information for your content. For example, having the ability to store the date of a meeting with the meeting minutes document will allow you to quickly locate that document by meeting date. And storing the value of a contract with each customer contract document would allow you to filter for all customers that have executed a contract greater than $100,000, which might aid a marketing effort.

Many fear the use of metadata as there is a belief it is too much work for the user when uploading a new document or adding a new item to a list.  Various techniques can be utilized to overcome these fears.  These techniques include (but are not limited to):

  • Using appropriate information architecture, categorization and grouping principles allows you to configure default metadata values.  This eliminates the need for a user to input the metadata value while allowing it to be present for search, faceted filtering, aggregation, workflow decisions and triggers, etc.

An example of this would be a Project Number metadata column associated with a project document.  If all documents associated with a specific project are managed in a single site, you already know the project number and can configure it as a default value.
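As a sketch of how that default could be configured with PowerShell (the site URL, library name, column name and value are all hypothetical):

```powershell
# Preset the "Project Number" column on a project site's library with a default value
$web = Get-SPWeb "http://intranet/projects/prj-1234"   # hypothetical project site

$list = $web.Lists["Project Documents"]
$field = $list.Fields["Project Number"]
$field.DefaultValue = "PRJ-1234"   # every new document inherits this value
$field.Update()

$web.Dispose()
```

With the default in place, users uploading documents to this library never have to type the project number, yet the value is still available for search, filtering and workflow.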

  • User education is also a very important technique.  The more your users understand the value of metadata, the more likely they will be to ensure metadata values are accurate.

Demonstrate the value of metadata to your users by setting up a POC and showing them search results with and without the use of metadata.  Demonstrate faceted filtering and advanced search with and without the use of metadata.  It is a powerful message.

Metadata is also the primary means by which users can filter aggregate views of content; this is known as faceted filtering.

Provide Meaningful and Accurate Information to Consumers

Virtually all employees in your organization will consume information to support their specific day to day business needs.

The consumption of information will be determined by a specific contextual need.  For example, an employee may wish to search your Intranet for the latest parking policy or a project team member may wish to find notes for a meeting that occurred two months ago.

When considering the consumption of information, we also must consider how it will be presented to those consumers; search results and aggregate views are but only a few that will be required.  Other means of presenting information will include dashboards, printed reports, charts, etc.

The only means by which we can query information and produce accurate search results, faceted filtering, aggregate views, dashboards, printed reports, charts, etc. is by first architecting the information, so we know what it is we are querying.  Without this, none of it is possible!

Promote the Agile Evolution of Information Use

There are many factors that drive the adoption and evolution of information use.  Of these, the most critical include:

Managing the Change of Information – Information is changing on a regular basis; project documents change and evolve, employee handbook and corporate policies are in a constant state of evolution.  Architecting your content management solution to facilitate the evolution of information is critical.

Work-in-Progress versus Published Information – During the process of information creation and collaboration, the state of that information is considered work-in-progress.  The solution must provide the means to determine work-in-progress versus published.  In some cases this is delivered using content versioning or security.  In other cases it requires a more complex means of disseminating the information.

Keeping Information Fresh and Up to Date – It is the responsibility of information owners to keep information fresh and up to date.  If the information in your content management solution is allowed to become stagnant and out of date, consumers are less likely to visit your Intranet, and information consumption will decline.

Versioning – An advanced content management solution provides the ability to version your content and to view and roll back to previous versions.  The content management lifecycle requires this functionality.
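As a sketch, versioning can be enabled on a document library with PowerShell (the site URL and library name are hypothetical, and the limits shown are illustrative choices, not recommendations):

```powershell
# Enable major/minor versioning on a library and keep a bounded version history
$web = Get-SPWeb "http://intranet/hr"   # hypothetical site URL
$list = $web.Lists["Policy Documents"]

$list.EnableVersioning = $true              # major versions
$list.EnableMinorVersions = $true           # draft (minor) versions
$list.MajorVersionLimit = 10                # keep the last 10 major versions
$list.MajorWithMinorVersionsLimit = 5       # keep drafts for the last 5 majors
$list.Update()

$web.Dispose()
```

Bounding the version history this way supports roll-back while keeping content database growth under control.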

Archival and Destruction – When information has reached the end of its lifecycle, often it is archived and/or destroyed.  This applies to all information in your environment, not just corporate records.  Thoroughly understanding the lifecycle (creation, evolution and eventual destruction) of information is a critical element of information architecture.

Governance

Governance of your information management solution comes in many forms: policies, procedures, processes, guidelines and education.  Without governance, all of your hard work architecting a solid solution will eventually crumble.  The negative consequences are too many to list in this article.

The primary elements of information and solution governance fall into the following categories:

Infrastructure Governance – In most organizations there is sufficient infrastructure governance: server builds, patch and update schedules, SLAs and so on.  The infrastructure to support your SharePoint installation will mostly fall within this existing governance.

Solution Configuration and Customization Governance – Configuration and customization governance is equally as important as infrastructure governance.  Implementing processes and controls to appropriately manage change and meet up-time SLAs is critical.

Solution Use Governance – It is here where most implementation teams fall short.  Governing the use of your SharePoint implementation is a means to keep configuration consistent.

Often I hear a department director or manager demand they have administrative rights to their department site.  In most cases, giving a non-technical user administrative rights to anything in SharePoint, is a huge mistake.  There are so many things that can, and will, go wrong in this scenario.  Here are a couple examples:

  • Because the non-technical user doesn’t understand the security model, they inadvertently give access to the wrong users.  I have seen employee salary information exposed in this exact situation.
  • Consider a non-technical user creating a new document library without using the data types (content types) your implementation team configured.  Now that same non-technical user uploads policy documents to this new library.  These policy documents won’t be aggregated to your corporate policy book, nor can they be searched (and filtered) as other corporate policy documents are.

These are but a couple of examples of insufficient use governance; there are many more negative consequences.

My recommendation is to lock your implementation down as tight as can be at first, only allowing the implementation team to make modifications.  Implement a federated use governance model as you educate users with the appropriate skills to make modifications that meet your best practices, policies, procedures and guidelines.
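As an illustration of this lock-down approach, here is a minimal server-side PowerShell sketch. The site URL and the "Implementation Team" group name are hypothetical placeholders for your own values. It breaks permission inheritance on a site and then removes every role assignment except the implementation team's:

```powershell
# Hypothetical site URL and group name - replace with your own values.
$web = Get-SPWeb "http://intranet/sites/hr"

# Break inheritance, copying the parent's role assignments so we can prune them.
if (-not $web.HasUniqueRoleAssignments) {
    $web.BreakRoleInheritance($true)
}

# Remove every principal except the implementation team's SharePoint group.
# Snapshot the collection with @() so we can remove items while iterating.
foreach ($assignment in @($web.RoleAssignments)) {
    if ($assignment.Member.Name -ne "Implementation Team") {
        $web.RoleAssignments.Remove($assignment.Member)
    }
}

$web.Update()
$web.Dispose()
```

Run this from a farm server with the Microsoft.SharePoint.PowerShell snapin loaded, and test it against a non-production site first; removing role assignments is not easily undone.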

Without governing the implementation, customization and use of your information management solution, it will evolve in an inconsistent and out-of-control manner and quickly become of little use to the business.

Education

Education at all levels is of paramount importance.  In most business environments your users have been driven to use file shares, local drives, email systems, etc. to store and manage their information.  Moving your culture from “the way we have always done it” to a completely new environment is a very large chasm to cross.  Users must be appropriately educated to trust and use the new system.

Conclusion

As you can see, there are many factors that contribute to the successful implementation, adoption and use of a SharePoint content management solution.  Installing SharePoint, configuring a few sites and throwing the keys to the kingdom over the wall will fail every time.  On the other hand, implementing sound information architecture techniques, governance and education will lead you down a path of success!

If you are in need of help with your SharePoint implementation, Information Architecture, education or training, please don’t hesitate to contact me; this is what I specialize in!

 

SharePoint 2013 Task Management

Task Management

The task management functionality available in SharePoint 2013 has many new and improved features, such as the ability to create subtasks, a new timeline view, more robust management of tasks using Microsoft Project 2013 and so on. In this article I will take a deeper look into these new and enhanced features of SharePoint 2013 task management.

For the purposes of this topic, I have created an “out-of-box” site using the Project Site template. No customizations have been made to the site, libraries or lists. The figure below is our demonstration site, using the standard Project Site template.

clip_image001

The figure below shows the standard Tasks list that is included with the Project Site template. The list itself is very similar to what we found in earlier versions of SharePoint. However, you will notice that a timeline view is now available. More about the timeline view later.

clip_image002

Task Assignment

An improvement in SharePoint 2013 task management is the ability to assign multiple resources to a task. In previous versions, you could only assign a single resource, which made using the "out-of-box" task management cumbersome when managing more complex projects.

clip_image003
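If you ever need to set multiple assignees from script rather than through the UI, the server object model exposes this through a user value collection. Here is a hedged sketch; the site URL, task title and login names are hypothetical, while AssignedTo is the task list's standard assignment field:

```powershell
$web = Get-SPWeb "http://intranet/sites/project"   # hypothetical URL
$list = $web.Lists["Tasks"]
$task = $list.Items | Where-Object { $_["Title"] -eq "Human Resources Intranet Requirements" }

# Build a multi-value user collection for the AssignedTo field.
$assignees = New-Object Microsoft.SharePoint.SPFieldUserValueCollection
foreach ($login in "CONTOSO\jdoe", "CONTOSO\asmith") {
    $user = $web.EnsureUser($login)
    $assignees.Add((New-Object Microsoft.SharePoint.SPFieldUserValue($web, $user.ID, $user.Name)))
}

$task["AssignedTo"] = $assignees
$task.Update()
$web.Dispose()
```

EnsureUser adds the login to the site's user information list if it is not already there, so the script works even for users who have never visited the site.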

Subtasks

You now have the ability to create subtasks, which is another new feature found in SharePoint 2013. I believe this was primarily implemented to support Microsoft Project integration. More on Microsoft Project integration below.

To create a subtask, click the parent item context menu and select the Create Subtask option.

clip_image004

The figure below shows our task list after adding two subtasks.

clip_image005

Timeline View

The timeline view provides you with the ability to display specified tasks in a timeline (Gantt-style) view. The timeline view can be enabled or disabled for any task list in SharePoint. This is accomplished by editing the web part properties and checking or unchecking the Show timeline option.

clip_image006

You can add and remove tasks from a timeline by clicking the task item context menu and selecting Add to Timeline or Remove from Timeline option.

clip_image007

In the figure below, you can see that I have added the Human Resources Intranet Requirements task to the timeline. You will find the timeline view is best suited to displaying high-level tasks and milestones. If you add too many tasks, it will become cluttered and difficult to read.

clip_image008

Aggregating Tasks to Your Personal Site

One of the best new features available in SharePoint 2013 is the aggregation of tasks, assigned to you, in your personal site. This provides you with a view of all tasks, across the entire farm, aggregated in one place. This means, regardless of where a task resides (a department site, project site, team site, etc.) they will all be aggregated to your personal site.

To accomplish this in the past, we had to customize a solution using search web parts, create a custom-developed solution or purchase a third-party tool.

Note: The personal site task aggregation feature utilizes the SharePoint search engine to index task information and make it available for querying. This means tasks will not be aggregated to your personal site until the search indexer has processed them.
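Because the aggregation is search-driven, you can query the same index yourself from a farm server. Below is a sketch using the server-side KeywordQuery API; the search service application name and user name are hypothetical, and 0x0108 is the standard Task content type ID:

```powershell
# Hypothetical search service application name - replace with your own.
$ssa = Get-SPEnterpriseSearchServiceApplication "Search Service Application"

# Query the index for task items assigned to a given user.
$query = New-Object Microsoft.Office.Server.Search.Query.KeywordQuery($ssa)
$query.QueryText = 'ContentTypeId:0x0108* AssignedTo:"John Doe"'
$query.RowLimit = 50

$executor = New-Object Microsoft.Office.Server.Search.Query.SearchExecutor
$results = $executor.ExecuteQuery($query)

# Pull the relevant-results table and report how many tasks the index returned.
$relevant = $results.Filter("TableType",
    [Microsoft.Office.Server.Search.Query.KnownTableTypes]::RelevantResults) |
    Select-Object -First 1
"{0} task(s) found in the index" -f $relevant.RowCount
```

If this query returns nothing for tasks you know exist, the search indexer simply has not crawled them yet, which is the same reason they would be missing from your personal site.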

To see an aggregate view of all your assigned tasks, simply go to your personal site and click the Tasks link in the left vertical navigation area.

In the figure below, you can see the tasks assigned to me.

clip_image009

You might have also noticed that when you click a task item context menu, you have the ability to edit the task and go directly to the containing site or list. This is very convenient for managing tasks that reside in many locations of your Intranet.

Managing Tasks with Microsoft Project

Microsoft has made the management of SharePoint tasks in Microsoft Project very convenient. To open a task list in Microsoft Project, simply navigate to the task list and click the Open with Project option on the ribbon bar.

clip_image010

Now that your SharePoint task list is open in Microsoft Project, you can use all of the features available in Microsoft Project.

clip_image011

After you make updates in Microsoft Project and save them, all of those updates are posted to the SharePoint task list.

Note: You may be asking yourself: Microsoft Project has many more features and fields than a SharePoint task list; is this information lost when you close Microsoft Project? The answer is no! Microsoft Project and SharePoint are tightly integrated, and the actual project file is also saved to the site, in the Site Assets library.

clip_image012

Extending Microsoft Project Fields to a Task List

There may be situations when you wish to publish additional information from Microsoft Project to your task list in SharePoint. This can be easily accomplished by mapping fields using the following steps.

For the purposes of our demonstration, I will map the Duration field from Microsoft Project to our task list in SharePoint.

  1. In Microsoft Project, click the File tab in the ribbon bar.
  2. In the Info section, click the Map Fields button.

clip_image013

  3. In the Map Fields dialog, click the Add Field button.
  4. In the Add Field dialog, for the Existing Project Field, select Duration from the drop-down list.
  5. For the New SharePoint Column, leave the default as Duration.
  6. Click the OK button when you have completed these steps.
  7. Click the OK button in the Map Fields dialog when you have mapped all the desired fields.

Now the Duration field in Microsoft Project is mapped to a new Duration field in the SharePoint task list. Save your project and the SharePoint task fields will be updated, containing the mapped field values.

As you can see in the figure below, the Duration field now contains the duration information from Microsoft Project.

clip_image014
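To confirm the mapped values actually landed in the list, you can read the new column from the server side. A minimal sketch follows; the site URL is hypothetical, while "Tasks" and "Duration" match the names used in the steps above:

```powershell
$web = Get-SPWeb "http://intranet/sites/project"   # hypothetical URL
$list = $web.Lists["Tasks"]

# Print each task's title alongside the Duration value mapped from Microsoft Project.
foreach ($item in $list.Items) {
    "{0}: {1}" -f $item["Title"], $item["Duration"]
}

$web.Dispose()
```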

Once you start editing and managing your project tasks in Microsoft Project, I have found it is easier to continue using Microsoft Project, rather than switching between editing tasks in SharePoint some of the time and in Microsoft Project at other times.

Conclusion

As you can see, there are many new and improved task management features found in SharePoint 2013. The ability to assign more than one resource to a task and the timeline view are two of my personal favorites. In addition, I use the Microsoft Project integration in virtually all project management scenarios I am involved with.

It is important to note that all of the features described in this article are also available in Office 365. I use Office 365 to run my business and manage all client projects, and I use these task management features and Office integration on a daily basis.

If you are interested in discussing these features, or obtaining a demonstration of how they may improve task management in your environment, please Contact Me. If you have any questions or comments, please leave a comment below; I will answer as I have availability.