Archive for the ‘Administration’ Category

SP1 Upgrade issues

October 27, 2011

This post is for you if:

* Your build version was "14.0.5139.5003" before the SP1 upgrade

* You get the following error: "There is no user table matching the input name 'AllSites'"

* Or you get: "Upgrade object too new (build version = 14.0.5139.5003, schema version = 4.1.7.0). Current server (build version = 14.0.6029.1000, schema version = 4.1.6.0)."

Root Cause:

Sometime in May 2011 there was a hotfix for a discussion board issue (KB2547226). After this hotfix, the build version of the config DB is 14.0.5139.5003 and the DB schema version is 4.1.7.0. You can check this schema version in the table named Versions in any of the content databases. From the error message above it is very clear that this version is greater than the one that comes with SP1.

There is also a good chance that your admin content database schema version is 4.1.10.0 (because some of the remedy steps we attempted produce this, but it is still of no use to us).
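To find out where you stand, you can read the schema version from a content database's Versions table without leaving PowerShell. This is only a sketch under assumptions: it picks the first content database, and reuses the VersionId GUID from the fix script below; run it from the SharePoint 2010 Management Shell.

```powershell
# Sketch: read the schema version rows from a content database's Versions table.
# Assumes the SharePoint snap-in is loaded and you can reach the SQL server.
$db = Get-SPContentDatabase | Select-Object -First 1
$conn = New-Object System.Data.SqlClient.SqlConnection($db.DatabaseConnectionString)
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText = "SELECT Version FROM Versions WHERE VersionId = '6333368D-85F0-4EF5-8241-5252B12B2E50'"
$reader = $cmd.ExecuteReader()
while ($reader.Read()) { $reader["Version"] }   # e.g. 4.1.7.0 after the hotfix
$conn.Close()
```

If this prints 4.1.7.0 (or 4.1.10.0) you are in the situation this post describes.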

Solution:

If you have installed the hotfix, then you don't have many choices. As of today (11-Oct-2011) there is no supported way to overcome this issue other than modifying the DB directly (I know it is unsupported, but you will get this solution even from Microsoft support).

So we have to manually change the schema version from 4.1.7.0 (or 4.1.10.0) to something lower (actually to the previous version number). It is highly recommended to back up the database before attempting any of this. Please try this in a lower environment first (involving premium support is the best thing to do, as they might have a better solution as days go by).

IF NOT EXISTS (SELECT * FROM sysobjects WHERE id = object_id(N'[dbo].[AllSites]') AND OBJECTPROPERTY(id, N'IsUserTable') = 1)
BEGIN
UPDATE Versions SET Version = '4.0.148.0' WHERE Version = '4.1.7.0' AND VersionId = '6333368D-85F0-4EF5-8241-5252B12B2E50'
END

Execute the above script on all content databases, including the admin content database. Don't execute this on service application DBs or the config DB. You might have to change the WHERE clause based on the current value.

Once this is executed, run the config wizard and it should get completed successfully.

Hope this helps someone.

SPWebConfigModification Class

May 27, 2011

We wanted to use the SPWebConfigModification class to modify the web.config of our web applications (some entries were already made manually), with different entries for different web applications. And all of this in PowerShell 🙂

Here are our important findings before we proceed:

  • Entries made in web.config outside of this class cannot be modified or deleted using this class.
  • You can apply the modifications to just one particular web application by calling WebAppObject.WebService.ApplyWebConfigModifications(), like the last line of the PowerShell given below.
  • Also test removing the entries through this class, to confirm that your entries are removed safely and to identify possible bugs in how they were added.
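That removal test can be sketched as follows: entries added through the class carry an Owner string, so we can pull ours out of the collection and remove them. The web application URL is a placeholder, and the owner value assumes you used the same 'APlainTextisEnough' string as in the script below.

```powershell
# Sketch: remove all web.config modifications we added earlier, identified by Owner.
$webApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup("http://yourwebapp")
$ours = @($webApp.WebConfigModifications | Where-Object { $_.Owner -eq 'APlainTextisEnough' })
foreach ($entry in $ours)
{
    [void]$webApp.WebConfigModifications.Remove($entry)
}
$webApp.Update()
$webApp.WebService.ApplyWebConfigModifications()
```

Filtering by Owner is exactly why the Owner property is worth setting to something recognizable when you add entries.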


So, to clean this mess, we first have to clear those entries manually and then add them back through SPWebConfigModification. Now I want to read the entries from an XML file, so that it is easily manageable and easy to ask someone in production to execute.

<WebConfigModifications>
<!--entry for caching config section-->
<WebConfig>
<NodePath>configuration/connectionStrings</NodePath>
<NodeValue>
<add name="Caching1" connectionString="Initial Catalog=Caching;Data Source={Server name};Integrated Security=false;Uid=sqlCachingUser;Pwd=p@ssw0rd" providerName="System.Data.SqlClient" />
</NodeValue>
<NodeKey>add[@name="Caching1"]</NodeKey>
</WebConfig>
</WebConfigModifications>

Now comes our PowerShell:

Write-Host "Enter WebApplication"
$IURL = Read-Host
Write-Host "Enter Config.xml file name"
$IConfigName = Read-Host
$IWebApp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup($IURL)
Write-Host "Reading the Config.xml file.."
$configFile = [xml](Get-Content $IConfigName)
Write-Host "Config.xml file read"
[System.Reflection.Assembly]::LoadWithPartialName("System.Xml")
foreach ($config in $configFile.WebConfigModifications.WebConfig)
{
    $Entry = New-Object Microsoft.SharePoint.Administration.SPWebConfigModification
    $Entry.Path = $config.NodePath
    $Entry.Name = $config.NodeKey
    $Entry.Type = [Microsoft.SharePoint.Administration.SPWebConfigModification+SPWebConfigModificationType]::EnsureChildNode
    $Entry.Owner = 'APlainTextisEnough'
    $Entry.Value = $config.NodeValue.get_InnerXml()
    $IWebApp.WebConfigModifications.Add($Entry)
}
$IWebApp.Update()
$IWebApp.WebService.ApplyWebConfigModifications()

Now execute this PowerShell script in the SharePoint 2010 Management Shell. Provide the first input as the web application URL for which you want to enter the modifications, and second, provide the path to the .xml file stored earlier, like .\config.xml.

That's it.. your modifications will be reflected only in the web application you provided.

[Update 1] If you try to apply a web.config entry and it does not go through successfully, then most likely that entry remains in the configuration database. And when you try to apply web.config modifications some time later, the older entry will still be attempted in the web.config file. To check which web.config modifications are queued to be applied, run:

$WebApp.WebConfigModifications

Through this, you can see the modifications that are being applied, even though they may not actually be present in the web.config file.


SharePoint Cache

May 19, 2011

Caching is very important for improving the performance of a site. When it comes to SharePoint, it is a bit of a grey area, because you don't know how the cache headers are set or on what logic. SharePoint sets the cache headers based on two cases:

Case 1: Content delivered from the file system (typically the _layouts folder)

Case 2: Content delivered from the content DB (like document libraries, picture libraries, etc.)

In case 1, the files delivered are not private, and will have a max-age value set to some number (more than 30 days, I believe). All subsequent requests to these resources will be served from cache. Unless and until you clear the cache, most of this content is delivered from cache.

In case 2, there are two sub-scenarios: one with BLOB cache enabled and one without.

First, without BLOB cache enabled, the cache headers delivered will be something like 'private, max-age=0', and the expiration will be 15 days before the current date (I don't know why that is so). This means that the cache is private (the user can cache this content), and all subsequent requests for this resource will be served from cache after checking with the server that the content has not been modified.

You will notice HTTP status code 304 for these requests, which confirms that the content is not modified. Each piece of content is generally delivered by the server with a 'Last-Modified' date. This means a round trip to the server is involved, but no data is transferred if the content is not modified.

Even this '304' status involves some work on the server. As the content is stored in the DB, only a DB query can tell whether the content has been modified after a specific time. So, when targeting large user bases (maybe in the tens of thousands), avoiding this is also a good idea, and enabling the BLOB cache is the best option.

Now with BLOB cache, the cache is public and max-age is '86400', which is 24 hours. All subsequent requests for these resources will be served from cache unless you clear the cache or press Ctrl+F5. This way, the round trip to the server to check whether the resource has been modified is avoided, along with the DB operation described before.
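You can observe these headers yourself without Fiddler. A sketch using plain .NET from PowerShell; the URL and image name are placeholders for an item in one of your picture libraries:

```powershell
# Sketch: inspect the cache headers SharePoint sends for a content-DB item.
$req = [System.Net.WebRequest]::Create("http://yoursite/PictureLibrary/TN.jpg")
$req.UseDefaultCredentials = $true
$resp = $req.GetResponse()
"Cache-Control : " + $resp.Headers["Cache-Control"]   # 'private, max-age=0' vs 'public, max-age=86400'
"Expires       : " + $resp.Headers["Expires"]
"Last-Modified : " + $resp.Headers["Last-Modified"]
$resp.Close()
```

With the BLOB cache working, Cache-Control should read public with a max-age set; private or an empty max-age points back to the scenarios above.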

TIP: When using Fiddler, don't press F5 to analyze the cache, because F5 behaves differently: the content DB items will need to be verified for the latest version before being served to the browser.

Hope this helps someone.

Flushing BLOB Cache

May 17, 2011

We will not go through what the BLOB cache is, as you can find so many articles on that. We needed to flush the BLOB cache because it may not work as expected. The Microsoft article you will find on how to do this is:

http://technet.microsoft.com/en-us/library/gg277249.aspx

But this article, I believe, is missing one crucial piece of information (as of 17th May 2011): you have to execute this script on all the WFEs in your farm. Otherwise, the cache gets flushed only on the server where you execute it.
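For reference, the flush itself is a short script along these lines (the web application URL is a placeholder); the point of this post is that you must run it separately on every WFE:

```powershell
# Flush the BLOB cache for one web application.
# IMPORTANT: run this on EVERY WFE in the farm, not just one server.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
$webApp = Get-SPWebApplication "http://yourwebapp"
[Microsoft.SharePoint.Publishing.PublishingCache]::FlushBlobCache($webApp)
Write-Host "Flushed the BLOB cache for:" $webApp
```

FlushBlobCache only clears the cache folder on the local server, which is why repeating it per WFE matters.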

Here is one tip to verify that the BLOB cache is working as expected.

Images that are stored in a picture library and referenced in a page should have the cache header 'public'. If the header is 'private' or the max-age is empty, then the BLOB cache is not working as expected and you should flush it. In the picture you can see an image named TN.jpg with the cache header 'private, max-age=0'.

Once the BLOB cache is enabled and flushed as mentioned in the TechNet article above (on all WFEs), the header to expect is 'public' with some max-age set.

Hope this helps someone.

Know the Build Version of SharePoint 2010

May 13, 2011

Sometimes you forget which CU has been installed in the farm. At times you are given just the build version of the farm, and you need to have the same version in your environment.

Every time I hit this situation I use Google to find it out, so here is the link (I am not taking credit for someone else's work) where you can find build versions up to the April 2011 CU:

http://www.sharepointedutech.com/2010/09/06/sharepoint-server-2010-patch-levels-and-cumulative-updates/
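If you just need the raw number from your own farm to match against that table, the farm object exposes it directly; from the SharePoint 2010 Management Shell:

```powershell
# Print the farm build version, e.g. 14.0.5139.5003, then look it up in the CU table.
(Get-SPFarm).BuildVersion
```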

Categories: Administration

Simple SQL query to know who created Site Collections in content DB

April 6, 2011

We have multiple test environments, and one of our requirements is different site templates. In order to test their site templates, our testers and users keep creating site collections, at an average of 150 per month. Our test environment is not designed to scale to these levels; one of the things impacted is the search service.

I could have enabled site use confirmation & deletion, but sometimes that is annoying, as we already have so many site collections in use. So I used a SQL query against the content database to give me the site names. Then I create a batch file to delete all those sites.

SELECT AA.SiteID, AA.tp_Login, AA.tp_Title, BB.FullUrl, AA.TimeCreated
FROM (SELECT S.Id AS SiteID, S.OwnerID, U.tp_Login, S.TimeCreated, U.tp_Title
      FROM [content_databasename].[dbo].[Sites] S
      INNER JOIN [content_databasename].[dbo].[UserInfo] U
      ON S.Id = U.tp_SiteID AND S.OwnerID = U.tp_ID
      WHERE S.TimeCreated > '2010-07-12 09:38:33.000') AA
INNER JOIN [content_databasename].[dbo].Webs BB
ON AA.SiteID = BB.SiteId AND BB.ParentWebId IS NULL
WHERE FullUrl LIKE 'sites/test%'

This could be handy for some administrators. It gives you the site URL, owner name, and creation date. Additional WHERE clauses can be included based on need.
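The deletion step can also be done from PowerShell instead of a batch file. A sketch under assumptions: the query output was saved to a hypothetical sites.csv with a FullUrl column, and the root web application URL is a placeholder.

```powershell
# Sketch: delete the site collections returned by the query above.
# Assumes sites.csv has a FullUrl column with values like 'sites/test123'.
Import-Csv .\sites.csv | ForEach-Object {
    $url = "http://yourwebapp/" + $_.FullUrl
    Remove-SPSite -Identity $url -GradualDelete -Confirm:$false
}
```

-GradualDelete spreads the deletion work out via a timer job, which is gentler on a busy content database than deleting everything at once.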

You can use BCS to get this data into a SharePoint list.


CMIS and SharePoint 2010

March 31, 2011

CMIS (Content Management Interoperability Services) is now catching up. What does CMIS do? It is like a protocol, defined by AIIM, for how different ECM systems can talk to each other over SOAP. For example, you can actually store your documents in Documentum 6.7 and make SharePoint the front-end tool for those documents.

Many organizations are looking at this for building cost-effective solutions. SharePoint 2010 officially supports CMIS; to get the capabilities you need the SharePoint 2010 Administration Toolkit (not exactly the toolkit itself, but a solution package which comes as part of it).

This TechNet article helps you get started: http://technet.microsoft.com/en-us/library/ff934619.aspx

This brings the capability to pull documents from any other CMIS-compatible system into SharePoint, and the other way around. It also opens up a different way to share documents across site collections or web applications, within the same SharePoint farm or a different one.

The CMIS Consumer web part in SharePoint still has to mature, but it shows the potential of CMIS. If you just install the spcmis.wsp solution that comes with the SP Administration Toolkit, you get all these capabilities. The TechNet article mentioned above explains how to use this solution package and the other web parts.

All the services which provide data in the CMIS format become available only after deployment of the above-mentioned solution.

It is also worth noting that the Consumer web part shows only content of type Document or Folder.

This link is useful when connecting to an Alfresco repository from SharePoint:

http://www.trentswanson.com/post/SharePoint-2010-CMIS-Consumer-Connects-to-Alfresco-Producer.aspx

Unknown facts about Target Audience

March 24, 2011

These may be known facts for a few.. 🙂

Recently we had an issue where Target Audience was not working in our test environments (and in production as well). It all started with the issue 'targeting a web part to an AD group is not working'. During the investigation we found the following issues:

-> We could not find many of our Security & Distribution groups in the audience picker tool.

-> The count of members displayed in the dialogue window was inconsistent.

-> Even compiling the audiences in Central Administration did not provide any improvement.

But this functionality was working fine before upgrading to 2010. With the help of a premium support ticket we found the resolution: groups also have to be imported along with users. In 2007, it was automatically assumed that all groups and users in the entire DC were imported. Here are the facts:

-> Along with the OUs which contain users, we need to select the OUs which contain the security groups and distribution lists.

-> Make sure you have selected the options to import both users & groups in the User Profile service.

-> The audience picker does not connect to AD to get the groups & distribution lists, unlike the people picker, which does connect to AD. Instead, it connects to the DB to get the groups.

-> The number of members in a group is also calculated on the fly from the database.

-> It takes a couple of incremental synchronizations to get everything in place (I don't know why; probably because we have tens of thousands of groups).

Hope this helps someone.

Share SharePoint Logs over HTTP

January 21, 2011

The SharePoint 2010 logging & correlation feature is a great thing. It helps administrators a lot in debugging many issues. But it adds overhead for administrators when the development team keeps asking for the logs for a particular correlation ID. This is especially true in environments lower than production. So how do we save our own time from this? 🙂

There are quite a few blogs which explain the BCS way of doing this:

http://www.shillier.com/archive/2010/08/05/Using-Business-Connectivity-Services-to-Display-SharePoint-2010-ULS-Logs.aspx

It comes with the advantage of a centralized place for SharePoint logs from all the servers in the farm. But in my scenario, BCS is a very costly solution, as the DB size keeps increasing; also, I want to share only my WFE log, which is what the developers are interested in. (I also have the advantage of having only one WFE.)

Keeping these in mind, I attempted the traditional method of sharing the entire logs folder through an IIS website.

Create Application Pool

Name: Logs

Select the .NET Framework version as "No Managed Code", leave the rest as it is, and then click OK.

Create IIS Web Site

Right-click the 'Web Sites' node and click 'New Web Site'. Fill out the form and select the application pool 'Logs' which was created earlier. Here we identify the site by host name; in my case I have given the full server name as the host name, e.g. servername.domainname.com.

Once the site is created, Enable ‘Directory Browsing’ for this site.

Now we are almost done. But if you browse to the site, you will see the file listing, while the .log files themselves will not be served until their MIME type is registered.

Create New MIME Type

Select the 'Logs' website from the left-hand tree and click 'MIME Types'. Click 'Add' and enter '.log' as the extension and 'text/plain' as the MIME type.
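The same steps can be scripted with appcmd instead of the IIS UI. A sketch with assumptions: the site name, host name, and the logs folder path (D:\SPLogs here) are placeholders for your own values.

```powershell
# Sketch: app pool, site, directory browsing, and the .log MIME type via appcmd.
$appcmd = "$env:windir\system32\inetsrv\appcmd.exe"
& $appcmd add apppool /name:Logs /managedRuntimeVersion:""
& $appcmd add site /name:Logs /bindings:"http/*:80:servername.domainname.com" /physicalPath:"D:\SPLogs"
& $appcmd set app "Logs/" /applicationPool:Logs
& $appcmd set config "Logs" /section:directoryBrowse /enabled:true
& $appcmd set config "Logs" /section:staticContent /+"[fileExtension='.log',mimeType='text/plain']"
```

Scripting this makes it repeatable if you later add more WFEs and want the same logs site on each.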

That’s it.

Now you should be able to access the logs over HTTP, at http://servername.domainname.com.

Hope this helps someone.





“The My Site of xxx is scheduled for deletion”

December 15, 2010

We had a strange issue (actually, it is a feature of SharePoint): when a user moves out of the organization (when his profile is removed by the profile import), a mail is sent to the user's manager saying 'The My Site of xxx is scheduled for deletion', and the manager is also given site collection admin access to that user's personal site. The issue is privacy: a user's My Site is a personal site and should never be assigned to the manager.

So, what we will attempt here is to:

-> Stop the email sent to the manager

-> Stop the personal site from being deleted

-> Stop the permission being assigned to the manager of the user

All the above activities are done through a timer job, 'MySiteCleanUpJob', which actually deletes the user profile and performs other activities, including the above. Thankfully, this can be overridden through code. Read the link below:

http://msdn.microsoft.com/en-us/library/microsoft.office.server.userprofiles.mysiteprofilehandler.aspx

So, we can write a class which implements IProfileEventInterface and overrides the method 'PreProfileDeleted'. If this method returns true, the profile will be deleted from the profile store; let's assume we need that to happen, so we will always return true.

Once the class is written, what next? How do we plug it in? Put the following code in a feature receiver:

SPWebService service = SPFarm.Local.Services.GetValue<SPWebService>(string.Empty);
MySiteCleanupJob job = service.JobDefinitions.GetValue<MySiteCleanupJob>(MySiteCleanupJob.DefaultJobName);
job.ProfileDeleteEventHandler = typeof(<<ClassName>>);
job.Update();

So far, everything explained is theoretical, so please try this in your test environment first. I will update the post once this has been experimented with.

[Update 2] The new class cannot be hooked up in the way specified above. It can be hooked up through "stsadm -o profiledeletehandler"; I could not find the equivalent PowerShell command.

Watch this space.. I will update once this is deployed and tested.

[Update 3] Here is the class

using System;
using Microsoft.SharePoint;
using Microsoft.Office.Server;
using Microsoft.Office.Server.UserProfiles;
using Microsoft.Office.Server.Diagnostics;

namespace MyCompanyMySiteProfileHandler.MySiteProfileHandler
{
    public class MyCompanyMySiteProfileHandler : IProfileEventInterface
    {
        public virtual bool PreProfileDeleted(UserProfile profile)
        {
            if (profile == null)
            {
                return false;
            }
            else
            {
                try
                {
                    using (SPSite site = profile.PersonalSite)
                    {
                        if (site != null)
                        {
                            Microsoft.Office.Server.UserProfiles.MySiteProfileHandler handler = new Microsoft.Office.Server.UserProfiles.MySiteProfileHandler();
                            // By passing null we do not set an owner for the personal site, but still schedule the site for deletion.
                            // This way the reminder email is also not sent, but the site will be deleted on the 14th day from now.
                            handler.SetMySiteOwner(profile, null);
                        }
                    }
                }
                catch (Exception oe)
                {
                    // Log the exception; do not rethrow, so the profile deletion itself is not blocked.
                }
            }

            // In any case return true, to make sure the profile does not remain unattended.
            return true;
        }

        public virtual bool PreProfileDeleted(ServerContext context, UserProfile profile)
        {
            return true;
        }
    }
}