Scroll to first ASP.Net Validation Error on Page

15 June, 2011

function ScrollToFirstError() {
    // Run the ASP.NET client-side validation for the page.
    Page_ClientValidate();
    if (Page_IsValid == false) {
        var topMostValidator;
        var lastOffsetTop;
        for (var i = 0; i < Page_Validators.length; i++) {
            var vld = Page_Validators[i];
            if (vld.isvalid == false) {
                // Track the failed validator nearest the top of the page.
                var offsetTop = PageOffset(vld);
                if (lastOffsetTop == undefined || offsetTop < lastOffsetTop) {
                    topMostValidator = vld;
                    lastOffsetTop = offsetTop;
                }
            }
        }
        topMostValidator.scrollIntoView();
    }
    return Page_IsValid;
}

// Walk up the offsetParent chain to get the element's absolute vertical position.
function PageOffset(theElement) {
    var selectedPosY = 0;
    while (theElement != null) {
        selectedPosY += theElement.offsetTop;
        theElement = theElement.offsetParent;
    }
    return selectedPosY;
}

Thanks to: http://msdn.microsoft.com/en-us/library/Aa479045


Web Part Event Lifecycle Summary

8 June, 2011

!Postback (initial request):

  1. ctor()
  2. OnInit()
  3. OnLoad()
  4. CreateChildControls()
  5. OnPreRender()

Postback:

  1. ctor()
  2. OnInit()
  3. CreateChildControls()
  4. OnLoad()
  5. btnDoPostBack_Click()
  6. OnPreRender()
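
For reference, a minimal sketch of a throwaway web part (the class name, the button and the Debug.WriteLine logging are purely illustrative) makes this ordering easy to observe in the debugger output:

using System;
using System.Diagnostics;
using System.Web.UI.WebControls;
using System.Web.UI.WebControls.WebParts;

// Hypothetical web part used only to trace the event order listed above.
public class LifecycleLoggingWebPart : WebPart
{
    public LifecycleLoggingWebPart()
    {
        Debug.WriteLine("ctor()");
    }

    protected override void OnInit(EventArgs e)
    {
        Debug.WriteLine("OnInit()");
        base.OnInit(e);
    }

    protected override void OnLoad(EventArgs e)
    {
        Debug.WriteLine("OnLoad()");
        base.OnLoad(e);
    }

    protected override void CreateChildControls()
    {
        Debug.WriteLine("CreateChildControls()");
        Button btnDoPostBack = new Button { ID = "btnDoPostBack", Text = "Do PostBack" };
        btnDoPostBack.Click += btnDoPostBack_Click;
        Controls.Add(btnDoPostBack);
        base.CreateChildControls();
    }

    private void btnDoPostBack_Click(object sender, EventArgs e)
    {
        Debug.WriteLine("btnDoPostBack_Click()");
    }

    protected override void OnPreRender(EventArgs e)
    {
        Debug.WriteLine("OnPreRender()");
        base.OnPreRender(e);
    }
}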

Comparing Microsoft’s SharePoint Online with a 3rd Party Offering

22 October, 2010

OK, so hosted SharePoint solutions have always looked pretty limited to experienced developers and administrators, haven't they? Plus Microsoft seems to be finally getting ready to re-launch its Online Services (including hosted SharePoint) as Office 365, bringing with it (if the screenshots are to be believed) an upgrade to SharePoint 2010. So there really isn't much point taking the time to compare the hosted SharePoint 2007 offerings from Microsoft's current Business Productivity Online Services and another provider, is there? Well, never underestimate the power of a slow day…

I'll say at this point that this is really all about some cool-ish things I found in Microsoft's offering that, you never know, might just come in handy. I'll also say that my comparative offering is from a large hosting provider whom I won't name, and I don't mean to imply any criticism of their service – I'm merely using them as an example of the kind of off-the-shelf hosted SharePoint site that one often gets for free when signing up for hosted Exchange mailboxes, for example.

Diving straight into the Site Collection settings for the Microsoft and 3rd party portals, you can see where the key differences lie:

Microsoft's Site Collection Settings

3rd Party Site Collection Settings

That's right: whereas the lower screenshot is pure WSS, from Microsoft we get some MOSS (Standard edition) features, namely:

  • MOSS(ish) Search: Search Center, Custom Scopes
  • Site Directory
  • Usage Reports
  • Audit
  • Information Management Policies
  • MOSS Web Parts
  • Translation Libraries
  • Administrator-approved InfoPath Forms

I think it's safe to say that the improved search experience and the additional Web parts were the things that most caught my eye (although we still don't have the ability to define new content sources). Really, what we have from Microsoft is MOSS without any Shared Services Provider, but if you are being asked to consider SharePoint Online, it's good to know where to go if you need/want these extra features.

On the subject of search, check out Microsoft sneaking in Federated Results from Bing!

Search Results

(Don’t worry, you can remove/edit these controls if you want).

Another important plus for the Microsoft offering: We can enable the Publishing Infrastructure for our sites.

This gives us a major advantage if the requirement is to create sites that look a little more like websites and a little less like the traditional WSS Team Site. In fact, in conjunction with the additional Web parts it should be possible to create something that looks pretty professional.

A couple of other things from the Microsoft solution also impressed:

  • The ability to create multiple Site Collections.
  • Admin interface allows you to dynamically re-allocate Site Collection storage.

Does Microsoft win hands down? Not entirely; there were a couple of things in the 3rd party offering that weren't in Microsoft's:

  • Custom Templates
  • Pre-built Visitors, Members and Owners groups for the Site Collection (OK, not a major job to create, but why not do it for us?)

Custom Templates

I'm sure I might just have got lucky in picking a hosting provider that loads up its SharePoint sites with so many templates, but in this case we had even more than the 'Fantastic 40' (or whatever it's called). From MS, we have no application templates at all. From the MS FAQ:

Where are the other “Fantastic 40” SharePoint templates? SharePoint Online only has a few of the “Fantastic 40” templates enabled. Enabling other templates is being considered for future releases of SharePoint Online.

What this means in practice is that you can still upload the older .stp Site Templates, but there is no facility to use .wsp solution-based templates. Given that, like all current hosted solutions (as opposed to rented dedicated servers), we have no facility for uploading our own customisations as solutions, having a large collection of pre-built templates would probably turn out to be useful.



A note about SharePoint Designer:

In both cases I was able to do the usual things like adding ASPX pages, editing a Master Page and messing about with the navigation.


SharePoint Document Library Mirroring

6 October, 2010

UPDATE: This project has now (finally) been ported to SharePoint 2010 (the source can still be found at http://datagilitydocmirror.codeplex.com/). Functionality is largely as described below, with differences including:

  • Mirroring is now enabled on a per-library basis, with a Ribbon command and dialog to allow the user to toggle the setting.
  • Site collection settings now take better advantage of the SharePoint 2010 UI (note: I really like the way that it's easy to leverage pre-built functionality such as templated sections like the OK/Cancel buttons).

Other changes:

  • All settings are now stored in a SharePoint list rather than as a PersistedObject, mainly to support a lightweight/stealth implementation that could still be driven without the UI elements.
  • Microsoft's Enterprise Library has been removed and logging is now to the ULS.
  • A Site Data Query now gets the GUIDs for the parent List and Web for each audited item, meaning we're not relying on trying to parse URLs (or similar) to get this information (see the sketch after this list).
  • Obviously, everything is now wrapped up in a nice VS2010 solution 🙂 which no longer includes the console app as it never really got used.
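
As a rough illustration of the Site Data Query point above, the sketch below (class and method names are my own, not the actual project code) shows the kind of query involved: GetSiteData returns implicit WebId, ListId and ID columns for every row, which is what removes the need for URL parsing.

using System.Data;
using Microsoft.SharePoint;

// A minimal sketch (names are illustrative, not the actual DocMirror code) of using an
// SPSiteDataQuery to resolve the parent Web and List GUIDs for documents across the site collection.
public static class AuditedItemLookup
{
    public static DataTable GetDocumentLocations(SPSite site)
    {
        SPSiteDataQuery query = new SPSiteDataQuery();
        query.Webs = "<Webs Scope=\"SiteCollection\" />";        // every web in the site collection
        query.Lists = "<Lists BaseType=\"1\" />";                // document libraries only
        query.ViewFields = "<FieldRef Name=\"FileLeafRef\" />";  // file name, plus the implicit WebId/ListId/ID columns

        SPWeb rootWeb = site.RootWeb;                            // not disposed here; owned by the SPSite
        return rootWeb.GetSiteData(query);
    }
}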

OVERVIEW
 
SharePoint Document Library Mirroring is an implementation of something akin to SourceSafe's Shadow Folders (i.e. it ensures that the most recent state of a site collection's document libraries is replicated at a designated location on the file system).

The source can be found at http://datagilitydocmirror.codeplex.com/ and this post gives an overview of the solution. It should be noted that Document Library Mirroring (DocMirror) is somewhat odd in that it has been built as a Visual Studio / WSS 3.0 solution with the aim of immediately migrating it to SharePoint 2010 (which will be the subject of another post in the near future). Consequently no further enhancements will be made on this branch of the code – these developments will occur in the SharePoint 2010 project.

So exactly what does DocMirror do? It allows an administrator to set a root folder for document mirroring and then replicates changes to documents within the site collection to folders at the same path relative to this root. For example, if (as shown below) the shadow root is set to C:\Code\Datagility.Shpt.DocMirror\ShadowFolder and mirroring is enabled, then when a document is added to the site collection's 'Shared Documents' folder…
 

A Document in a Document Library

… then this document will be written to the folder C:\Code\Datagility.Shpt.DocMirror\ShadowFolder\Shared Documents.
 

The Document Mirrored to the File System (Advanced Stuff!)

If this document is subsequently updated (either the document itself or its properties – well, its name anyway), the document on the file system is replaced by the newer version, and if the document is moved or deleted from this location it is deleted from the file system. Sub-sites under a site collection and sub-folders in a document library simply become folders under the shadow root, e.g. C:\Code\Datagility.Shpt.DocMirror\ShadowFolder\subsite1\Documents\folder1
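
To make that mapping concrete, something along these lines (a hypothetical helper, not the actual DocMirror code) is all that is needed to turn a site-collection-relative URL into a shadow-folder path:

using System.IO;

// Hypothetical sketch of the path mapping described above: a document's
// site-collection-relative URL becomes a path under the shadow root.
public static class ShadowPathMapper
{
    public static string GetShadowPath(string shadowRoot, string siteRelativeUrl)
    {
        // e.g. "subsite1/Documents/folder1/Spec.docx" (file name made up)
        //   -> "C:\Code\Datagility.Shpt.DocMirror\ShadowFolder\subsite1\Documents\folder1\Spec.docx"
        string relativePath = siteRelativeUrl.TrimStart('/').Replace('/', Path.DirectorySeparatorChar);
        return Path.Combine(shadowRoot, relativePath);
    }
}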
 
Why would you want to do this?
 
Well, the initial requirement was for a client new to SharePoint, but more importantly also new to SQL Server. The conversation went along the lines of “You want us to put all our documents where? What are we going to do when it stops working?”. So the shadow folder became their safety net in case they ever had to wait to restore a broken content database. However, I’ve also started to use it in conjunction with Live Mesh (which is called something else by now): Changes to SharePoint documents get played out to the shadow folder which also happens to be a ‘Mesh-ed’ folder and so these changes are further synchronised to my Live Desktop and every device that I’ve added to my Mesh. I’m sure I’ll think of other uses as well.
 
How does it work?
 
DocMirror works by reading the WSS Audit Log and replaying any captured changes to documents (it's a log miner). This means that auditing must be enabled from the Central Admin site and the correct actions must be captured before DocMirror has any useful work to do. See http://msdn.microsoft.com/en-us/library/bb397403(office.12).aspx for more details on enabling auditing. With the correct events being written to the log, DocMirror uses a custom SharePoint timer job to periodically query the log and process any changes.
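
By way of illustration, the log mining boils down to something like the sketch below (the names and the choice of event types are assumptions rather than the exact DocMirror logic):

using System;
using Microsoft.SharePoint;

// A minimal sketch of mining the audit log for document changes since the job last ran.
public static class AuditLogReader
{
    public static void ProcessChangesSince(SPSite site, DateTime lastRun)
    {
        SPAuditQuery query = new SPAuditQuery(site);
        query.SetRangeStart(lastRun);          // only entries written since the last run
        query.RestrictToListItemsOnly();       // ignore site/list-level events

        SPAuditEntryCollection entries = site.Audit.GetEntries(query);
        foreach (SPAuditEntry entry in entries)
        {
            switch (entry.Event)
            {
                case SPAuditEventType.Update:
                case SPAuditEventType.CheckIn:
                    // ... copy the current version of the document at entry.DocLocation
                    //     to the corresponding path under the shadow root ...
                    break;
                case SPAuditEventType.Delete:
                    // ... remove the mirrored copy for entry.DocLocation ...
                    break;
            }
        }
    }
}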
 

The Custom Document Mirroring Timer Job

For now, the available configuration settings are pretty limited: an administrator can enable or disable mirroring and set the root folder (a planned enhancement is to allow individual libraries to be selected or deselected for mirroring). These settings are accessed from an admin page linked from the Site Settings page.

The Admin Option in Site Settings

The admin options are saved as SPPersistedObject objects, ensuring that they're available to all WFE servers in a farm, but it's worth noting that the timer job is scoped such that it only runs on one server (I wanted to avoid any potential conflicts arising from more than one instance of the job processing the log and accessing the file system at the same time). See http://blogs.pointbridge.com/Blogs/morse_matt/Pages/Post.aspx?_ID=55 for a discussion of how this is achieved (and how the MSDN documentation seems to be incorrect in this case).
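
For anyone interested in the shape of that scoping, here is a rough sketch (class and job names are illustrative) of a job definition created with SPJobLockType.Job, which ensures only one instance runs across the farm:

using System;
using Microsoft.SharePoint.Administration;

// Illustrative job definition: SPJobLockType.Job prevents two servers from
// processing the audit log and writing to the shadow folder at the same time.
public class DocumentMirroringJob : SPJobDefinition
{
    public DocumentMirroringJob() : base() { }   // parameterless constructor required for serialization

    public DocumentMirroringJob(string name, SPWebApplication webApplication)
        : base(name, webApplication, null, SPJobLockType.Job)
    {
        Title = "Document Library Mirroring";
    }

    public override void Execute(Guid targetInstanceId)
    {
        // ... load the persisted settings, query the audit log and replay any changes ...
    }
}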

Document Mirroring Options

 As a slight aside, the solution also contains a console app from which you can access the same functionality as the timer job. It’s been useful in testing and development and I can imagine using it to carry out a ‘controlled’ one-off processing of a very large log as mirroring is established.

The Mirroring Console App

I'm not going to go into too much detail here as to exactly how the solution is built, as the code is freely available for download (see above) and I think it's pretty easy to follow, but I will highlight a few key points. The solution is a Visual Studio 2008 solution that uses VSeWSS 1.3 to build the deployable WSP. All elements (the timer job, the admin page etc…) are deployed as features by this solution, although I did run into issues with deploying from Visual Studio. In the end, the process became: package within VS and then run the generated Setup.bat from a separate console window.

The Mirroring Visual Studio Solution

Unit tests are included as part of the solution and I must confess they are perhaps not quite as comprehensive as they could be (I haven't done the code coverage analysis), but they are still pretty thorough and do allow each of the different elements of the logic to be tested independently. One aim I originally had when starting this project was for deployment to be as simple as possible, even in a multiple-WFE farm. All was going well until I looked at how DocMirror would log its activity. I considered logging to SharePoint's internal log, but couldn't get past the fact that the documentation tells you that you're not allowed to do that, so I fell back on the Enterprise Library. DocMirror uses Enterprise Library 4.1 Logging, which makes it very simple to have processing activity logged as below…

A Log File!

… but it does mean:

  1. A lot of information now needs to be written to the config file (on each WFE server) and I didn't fancy trying to build it up using SPWebConfigModification objects because the limitations of this are well documented and…
  2. The config needs to be accessible to the custom timer job, which runs under the Windows SharePoint Services Timer service (OWSTIMER.EXE), so putting it in Web.config wouldn't help anyway. It either needs to go in OWSTIMER.exe.config or Machine.config on whichever WFE has been designated to run the timer job, and I haven't yet figured out a way of neatly automating this deployment.
  3. We now have dependencies on the Enterprise Library binaries (3 of them, anyway) which need to be GAC’ed. What if another solution has already deployed another set of these binaries (signed with a different key)? Do we want to just keep adding multiple side-by-side assemblies?

I haven’t solved these problems in this version so some post-installation modification of config files is currently necessary.

All of which means that the planned enhancements for the next version are:

  • Look again at the admin page (it currently inherits from WebPartPage which means that it looks like a content page and not a settings page).
  • Create a robust deployment package.
  • Allow greater flexibility with regard to what gets mirrored.

[UPDATE: Planned enhancements deployed with 2010 version!]


Benchmarking a Very Small Index Server

20 September, 2010

Very limited capacity SharePoint 2007 Index Server (indexing its own content – i.e. it is also a Web server):

  • 1 CPU
  • 2GB RAM
  • Index corpus = 1,841 items

With unlimited concurrency (i.e. no throttling via Crawler Impact Rules), a full crawl took well over 10 minutes – in fact, it had only indexed around 400 items in this time. Performance improved when Crawler Impact Rules were used to throttle crawling:

  • 2 Concurrent Requests: Full Crawl in 07:30 (mm:ss)
  • 4 Concurrent Requests: Full Crawl in 06:25
  • 8 Concurrent Requests: Full Crawl in 06:00
  • 16 Concurrent Requests: Full Crawl in 05:55

Not particularly scientific, but it looks like the ‘sweet spot’ has been found at 8 Concurrent Requests.


Forcing Profile Sync when using AD Groups

13 September, 2010

If the contents of the UserInfo table(s) have become out of sync with the Profile database for a given user and the Profile Sync job doesn't pick this up, the usual approach is to remove the user from the All People list for the Site Collection(s) and re-add them.

However, what if you’re using AD groups and the user doesn’t appear in the list?

Simple! Add them (to any group – e.g. Visitors) and then remove them. You should then find that the content DB for the Site Collection has been updated.
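
If you would rather script it than click through the UI, the same nudge can be done with the object model – a quick sketch (the URL, group name and account below are all made up):

using Microsoft.SharePoint;

// Hypothetical helper: adding the user creates their entry in the site collection's
// UserInfo list, and removing them again leaves that entry to be picked up by the
// next profile synchronisation.
public static class ProfileSyncNudge
{
    public static void TouchUser(string siteUrl, string groupName, string loginName)
    {
        using (SPSite site = new SPSite(siteUrl))
        using (SPWeb web = site.OpenWeb())
        {
            SPGroup group = web.SiteGroups[groupName];
            group.AddUser(loginName, string.Empty, loginName, string.Empty);
            group.RemoveUser(web.EnsureUser(loginName));
        }
    }
}

// Usage (hypothetical values):
// ProfileSyncNudge.TouchUser("http://server/sites/teamsite", "Visitors", @"DOMAIN\jsmith");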

UPDATE: See http://msfarmer.blogspot.com/2009/11/user-profiles-changes-are-not-updating.html for an explanation of the relationship between this behaviour and WSS users being 'inactive'.


Deleting ‘Rogue’ Documents/Pages from SharePoint Sites

3 September, 2010

When STSADM doesn’t pick up orphaned documents and you want to get rid of them from the content database…

-- Run against the content database for the site collection that hosts the orphan.
DECLARE @pageName nvarchar(128)
DECLARE @dir nvarchar(256)
DECLARE @uid uniqueidentifier

SET @pageName = N'Holidays.aspx'
SET @dir = N'Resources/LocalInfo/TeamSiteFrance/Pages'

-- Find the document's Id from its leaf name and folder path.
SELECT @uid = (SELECT [Id] FROM dbo.AllDocs WHERE LeafName = @pageName AND DirName = @dir)

-- Remove the document content, its metadata row and its list item data.
DELETE [dbo].[AllDocStreams] WHERE [Id] = @uid

DELETE [dbo].[AllDocs] WHERE [Id] = @uid

DELETE [dbo].[AllUserData] WHERE tp_LeafName = @pageName AND tp_DirName = @dir