SharePoint Cache

May 19, 2011

Caching is very important for improving the performance of a site. When it comes to SharePoint, it is a bit of a grey area because you don’t know how the cache headers are set or what logic drives them. SharePoint sets the cache headers based on where the content is served from:

Case 1:  Content delivered from the file system (typically the _layouts folder)

Case 2:  Content delivered from the content DB (document libraries, picture libraries, etc.)

In case 1, the files delivered are not private and have a max-age value set to some large number (more than 30 days, I believe). All subsequent requests for these resources are served from the browser cache; unless you clear the cache, most of this content is never fetched from the server again.

In case 2, there are two sub-scenarios: one with the BLOB cache enabled and one without.

First, without the BLOB cache enabled, the cache headers delivered will be something like ‘Private, max-age=0’, and the expiration date will be 15 days before the current date (I don’t know why that is). This means the cache is private (only that user’s browser may cache the content), and every subsequent request for the resource is served from cache only after checking with the server that the content has not been modified.

You will notice HTTP status code 304 for these requests, which confirms the content has not been modified. Each item is delivered by the server with a ‘Last-Modified’ date. So a round trip to the server is still involved, but no data is transferred if the content is unchanged.

Even this 304 response involves work on the server: because the content is stored in the database, only a DB query can tell whether it has been modified since a given time. So when you are targeting a large user base (maybe tens of thousands of users), avoiding even these checks is a good idea, and enabling the BLOB cache is the best option.

Now with the BLOB cache enabled, the cache is public and max-age is 86400, which is 24 hours. All subsequent requests for these resources are served from the browser cache unless you clear the cache or do Ctrl+F5. This avoids the round trip to the server to check whether the resource has been modified, which, as described above, would otherwise involve a DB operation.
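If you want to see this behaviour for yourself outside the browser, here is a minimal sketch, assuming Python with the ‘requests’ package installed; the image URL is a made-up placeholder, so substitute a real resource from your site, and depending on your authentication you may need to pass credentials (e.g. HttpNtlmAuth from the requests_ntlm package).

import requests

# Placeholder URL; replace with an image from a picture library on your site.
url = "http://sharepoint.example.com/PublishingImages/TN.jpg"

# First request: inspect the cache headers SharePoint sends.
first = requests.get(url)
print("Cache-Control:", first.headers.get("Cache-Control"))
print("Expires:", first.headers.get("Expires"))
print("Last-Modified:", first.headers.get("Last-Modified"))

# Second request: ask conditionally, the way a browser does for 'Private, max-age=0'
# content. Without the BLOB cache you should see 304 Not Modified; with the BLOB
# cache the browser would normally not make this round trip at all until the
# 24-hour max-age expires.
second = requests.get(
    url,
    headers={"If-Modified-Since": first.headers.get("Last-Modified", "")},
)
print("Conditional request status:", second.status_code)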

TIP: When using Fiddler to analyze the cache, don’t press F5. F5 behaves differently: it forces the content DB items to be re-validated for the latest version before they are served to the browser.

Hope this helps someone.

Flushing BLOB Cache

May 17, 2011

I will not go through what the BLOB cache is, as you can find plenty of articles on that. Sometimes the BLOB cache does not work as expected and needs to be flushed. The Microsoft article you will find describing how to do this is:

http://technet.microsoft.com/en-us/library/gg277249.aspx

But this article, I believe, is missing one crucial piece of information (as of 17th May 2011): you have to execute this script on all WFEs in your farm. Otherwise, the cache gets flushed only on the server where you run it.

One tip to verify that the BLOB cache is working as expected:

Check the images that are stored in a picture library and referenced on the page. If the cache header is not “public” — that is, it is private or the max-age is missing — then the BLOB cache is not working as expected and you should flush it. For example, an image named TN.jpg may show a cache header of ‘Private, max-age=0’.

Once the BLOB cache is enabled and flushed as described in the TechNet article above (on all WFEs), the expected header is ‘Public’ with some max-age set.
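To check this without Fiddler, here is a rough sketch, assuming Python with the ‘requests’ package; the URLs below are placeholders for images in your picture libraries, and you may need to supply credentials depending on your authentication setup.

import requests

# Placeholder URLs; list the picture-library images referenced on your pages.
urls = [
    "http://sharepoint.example.com/PublishingImages/TN.jpg",
    "http://sharepoint.example.com/PublishingImages/banner.png",
]

for url in urls:
    # Depending on your authentication you may need to pass credentials here
    # (e.g. HttpNtlmAuth from the requests_ntlm package).
    response = requests.get(url)
    cache_control = response.headers.get("Cache-Control", "")
    if "public" in cache_control.lower() and "max-age" in cache_control.lower():
        print("OK      ", url, "(", cache_control, ")")
    else:
        print("SUSPECT ", url, "(", cache_control, ") -- BLOB cache may need a flush")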

Hope this helps someone.

Know the Build Version of SharePoint 2010

May 13, 2011

Sometimes you forget which CU has been installed in the farm. At other times you are given just the build version of the farm and you need to have the same version in your environment.

Every time I hit this situation I use Google to find it, so here is the link (I am not taking credit for someone else’s work) where you can find build versions up to the April 2011 CU:

http://www.sharepointedutech.com/2010/09/06/sharepoint-server-2010-patch-levels-and-cumulative-updates/

Categories: Administration

Simple SQL query to know who created Site Collections in content DB

April 6, 2011

We have multiple test environments, and one of our requirements involves different site templates. To test their site templates, our testers and users keep creating site collections at an average of 150 per month. Our test environment is not designed to scale to these levels, and one of the things affected is the search service.

I could have enabled site use confirmation and deletion, but that gets annoying since we already have so many site collections in use. So I used a SQL query against the content database to give me the site names, and then created a batch file to delete those sites (a sketch of that step follows the query below).

SELECT AA.SiteID, AA.tp_Login, AA.tp_Title, BB.FullUrl, AA.TimeCreated
FROM (SELECT S.Id AS SiteID, S.OwnerID, U.tp_Login, S.TimeCreated, U.tp_Title
      FROM [content_databasename].[dbo].[Sites] S
      INNER JOIN [content_databasename].[dbo].[UserInfo] U
        ON S.Id = U.tp_SiteID AND S.OwnerID = U.tp_ID
      WHERE S.TimeCreated > '2010-07-12 09:38:33.000') AA
INNER JOIN [content_databasename].[dbo].[Webs] BB
  ON AA.SiteID = BB.SiteId AND BB.ParentWebId IS NULL
WHERE FullUrl LIKE 'sites/test%'

This could be handy for some administrators. It gives you the site URL, the owner name, and the creation date. Additional WHERE clauses can be added based on your needs.
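For the batch file step mentioned above, here is a rough sketch of how the two pieces could be glued together, assuming Python with the pyodbc package and read access to the content DB; the connection string, web application URL and output path are placeholders.

import pyodbc

# Placeholder connection string; point it at the SQL instance hosting the content DB.
conn = pyodbc.connect(
    "DRIVER={SQL Server};SERVER=sqlserver;DATABASE=content_databasename;Trusted_Connection=yes;"
)
cursor = conn.cursor()

# Same query as above; FullUrl is the site-collection relative URL we need.
query = """
SELECT AA.SiteID, AA.tp_Login, AA.tp_Title, BB.FullUrl, AA.TimeCreated
FROM (SELECT S.Id AS SiteID, S.OwnerID, U.tp_Login, S.TimeCreated, U.tp_Title
      FROM [content_databasename].[dbo].[Sites] S
      INNER JOIN [content_databasename].[dbo].[UserInfo] U
        ON S.Id = U.tp_SiteID AND S.OwnerID = U.tp_ID
      WHERE S.TimeCreated > '2010-07-12 09:38:33.000') AA
INNER JOIN [content_databasename].[dbo].[Webs] BB
  ON AA.SiteID = BB.SiteId AND BB.ParentWebId IS NULL
WHERE FullUrl LIKE 'sites/test%'
"""
cursor.execute(query)

# Write one stsadm delete command per test site collection.
with open("delete_test_sites.bat", "w") as bat:
    for row in cursor.fetchall():
        url = "http://sharepoint.example.com/" + row.FullUrl
        bat.write('stsadm -o deletesite -url "' + url + '"\r\n')

Review the generated .bat before running it; deletesite is not reversible.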

You can also use BCS to surface this data in a SharePoint list.

Categories: Administration

CMIS and SharePoint 2010

March 31, 2011

CMIS (Content Management Interoperability Services) is now catching on. What does CMIS do? It is a protocol (an OASIS standard) that defines how different ECM systems can talk to each other over SOAP. For example, you can store your documents in Documentum 6.7 and use SharePoint as the front-end tool for those documents.

Many organizations are looking at this for building cost-effective solutions. SharePoint 2010 officially supports CMIS; to get the capabilities you need the SharePoint 2010 Administration Toolkit (not exactly the toolkit itself, but a solution package that ships as part of it).

This TechNet article helps you get started: http://technet.microsoft.com/en-us/library/ff934619.aspx

This brings the capability to pull documents from any other CMIS-compatible system into SharePoint, and the other way around. It also opens up another use case: sharing documents across site collections or across web applications, within the same SharePoint farm or a different one.

The CMIS consumer web part in SharePoint still has to mature, but it shows the potential of CMIS. If you just install the spcmis.wsp solution that comes with the SP Administration Toolkit, you get all these capabilities. The TechNet article mentioned above explains how to use this solution package and the web parts.

All the services that provide data in the CMIS format become available only after the above solution is deployed.

It is also worth noting that the consumer web part shows only content of type Document or Folder.

This link is useful when connecting to an Alfresco repository from SharePoint:

http://www.trentswanson.com/post/SharePoint-2010-CMIS-Consumer-Connects-to-Alfresco-Producer.aspx
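Besides the SOAP binding, CMIS also defines a REST/AtomPub binding, which makes it easy to poke at any producer with plain HTTP. Here is a rough sketch, assuming Python with the ‘requests’ package; the service document URL and credentials are placeholders, and the exact URL depends on your repository (the TechNet article above covers where SharePoint publishes its producer endpoints once spcmis.wsp is deployed).

import requests
import xml.etree.ElementTree as ET

# Placeholder: the CMIS AtomPub service document URL exposed by your producer.
service_url = "http://cms.example.com/cmis/atom"

# Credentials handling depends on the producer's authentication (NTLM for SharePoint,
# basic auth for many Alfresco setups, etc.).
response = requests.get(service_url, auth=("user", "password"))
print("HTTP status:", response.status_code)

# The service document is AtomPub XML listing the repositories (workspaces);
# print every atom:title it contains to confirm the producer is reachable.
root = ET.fromstring(response.content)
for title in root.iter("{http://www.w3.org/2005/Atom}title"):
    print("Title:", title.text)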

Unknown facts about Target Audience

March 24, 2011

These may be known facts for a few. 🙂

Recently we had an issue where target audiences were not working in our test environments (and in production as well). It all started with the issue ‘targeting a web part to an AD group is not working’. During the investigation we found the following problems:

-> We could not find many security and distribution groups in the audience picker tool.

-> The count of members displayed in the dialogue window was inconsistent.

-> Even compiling the audience in Central Administration did not provide any improvement.

But this functionality was working fine before upgrading to 2010. With the help of a premium support ticket we found the resolution: groups have to be imported along with users. In 2007, it was automatically assumed that all groups and users in the entire domain were imported. Here are the facts:

-> Along with the OUs that contain users, you need to select the OUs that contain the security groups and distribution lists.

-> Make sure you have selected the option to import both users and groups in the User Profile Service.

-> Unlike the people picker, which connects to AD, the audience picker does not connect to AD to get the groups and distribution lists. Instead it connects to the database to get the groups.

-> The number of members in a group is also calculated on the fly from the database.

-> It takes a couple of incremental synchronizations to get everything in place (I don’t know why; probably because we have tens of thousands of groups).

Hope this helps someone.

Share SharePoint Logs over HTTP

January 21, 2011

The SharePoint 2010 logging and correlation feature is a great thing. It helps administrators a lot when debugging issues. But it also adds overhead for administrators when the development team keeps asking for the logs for a particular correlation ID. This is especially true in environments lower than production. So how do we save our own time here? :)

There are quite a few blogs that explain the BCS way of doing this:

http://www.shillier.com/archive/2010/08/05/Using-Business-Connectivity-Services-to-Display-SharePoint-2010-ULS-Logs.aspx

It has the advantage of providing a centralized view of SharePoint logs from all the servers in the farm. But in my scenario, BCS is a costly solution as the DB size keeps increasing, and I want to share only the WFE logs that the developers are interested in. (I also have the advantage of having only one WFE.)

Keeping this in mind, I used the traditional method of sharing the entire logs folder through an IIS web site.

Create Application Pool

Name: Logs

Select the .NET Framework version as “No Managed Code”, leave the rest as it is, and then click OK.

Create IIS Web Site

Right-click the Web Sites node and click ‘New Web Site‘. Fill out the form, select the ‘Logs’ application pool created earlier, and set the physical path to the SharePoint ULS logs folder. Here we identify the site by its host name; in my case I gave the full server name as the host name, e.g. servername.domainname.com.

Once the site is created, enable ‘Directory Browsing’ for it.

Now we are almost done. If you browse to the site you will see the file listing, but the .log files themselves will not be served until IIS knows their MIME type.

Create New MIME Type

Select the ‘Logs’ web site from the left-hand tree, click ‘MIME Types’, then click ‘Add’ and enter ‘.log’ for the extension and ‘text/plain’ for the MIME type.

That’s it.

Now you should be able to access the logs over HTTP, e.g. http://servername.domainname.com.
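Once the logs are reachable over HTTP, developers can even pull out the lines for a correlation ID themselves. A rough sketch, assuming Python with the ‘requests’ package; the server name, log file name and correlation ID below are placeholders (ULS log files are normally named like SERVERNAME-YYYYMMDD-HHMM.log).

import requests

# Placeholders: the log site created above, a specific ULS log file, and the
# correlation ID the developer asked about.
base_url = "http://servername.domainname.com/"
log_file = "SERVERNAME-20110121-1530.log"
correlation_id = "0f65e3a1-1234-5678-9abc-def012345678"

# Stream the log so a large file is not held in memory all at once.
# Depending on your IIS authentication settings you may need to pass credentials.
with requests.get(base_url + log_file, stream=True) as response:
    response.raise_for_status()
    for line in response.iter_lines(decode_unicode=True):
        if line and correlation_id in line:
            print(line)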

Hope this helps someone.




Categories: Administration