Enabling the Document ID Service in a site collection creates a new Document ID column for documents and document sets in the site collection, which will contain a unique ID based on a configurable format.
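Enabling it can be done through Site Settings, or scripted. A minimal sketch, assuming the on-prem SharePoint Server PowerShell snap-in and a hypothetical site URL (the feature name "docid" is the out-of-box Document ID feature):

```powershell
# Sketch: enable the Document ID Service feature on a site collection.
# The site URL is a placeholder; replace with your own.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$siteUrl = "http://sharepoint/sites/docs"   # hypothetical site collection URL
Enable-SPFeature -Identity "docid" -Url $siteUrl
```

After enabling, the ID assignment is performed by a timer job, so existing documents may not get IDs immediately.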
This is a really nice feature: it lets you uniquely identify your documents across a site collection, and it comes with a permalink function (the document keeps the same unique URL regardless of where it moves within the site collection).
However, when you search for a Document ID, you’ll get this: Nothing here matches your search…
We had an issue with a client where their SharePoint search crawls would periodically complete successfully, but subsequent crawls would show a lot of warnings saying "The claims cookie used by the crawler has expired."
For this web application, we had the Default zone set up with Windows Authentication for the search crawler, and an ADFS Trusted Identity Provider for users to log in through a custom login page.
The client also had a requirement that logins expire after an hour, so the CookieLifetime setting in the Security Token Service configuration for the farm had been reduced from the default of 5 days to 1 hour.
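The CookieLifetime change described above would have been made along these lines (a sketch, assuming the SharePoint Server PowerShell snap-in is available on a farm server):

```powershell
# Sketch: shorten the claims cookie lifetime for the farm's
# Security Token Service from the default (5 days) to 1 hour.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$sts = Get-SPSecurityTokenServiceConfig
$sts.CookieLifetime                          # inspect the current value first
$sts.CookieLifetime = New-TimeSpan -Hours 1  # the 1-hour requirement from this post
$sts.Update()
```

Note that this setting is farm-wide, which is why it can affect the search crawler's cookie as well as end-user logins.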
Added a new field (AddressLine1) to a list, and needed to display the field in all list views. It needed to always display before another field (AddressLine2).
Script below loops through all views of a list, and if the AddressLine2 field exists, it will add AddressLine1 just before it.
Note: Can’t use ForEach loop to go through each list view as the .Update() will throw an error, due to restrictions in updating collection items while enumerating.
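A sketch of such a script, assuming the server-side object model; the web URL and list name are placeholders. An indexed for loop is used instead of ForEach for the reason noted above:

```powershell
# Sketch: add AddressLine1 immediately before AddressLine2 in every
# view of a list that displays AddressLine2.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web  = Get-SPWeb "http://sharepoint/sites/docs"   # hypothetical web URL
$list = $web.Lists["Addresses"]                     # hypothetical list name

# Indexed loop: calling .Update() inside a ForEach over $list.Views
# throws because the collection can't be modified while enumerating.
for ($i = 0; $i -lt $list.Views.Count; $i++) {
    $view   = $list.Views[$i]
    $fields = $view.ViewFields.ToStringCollection()

    if ($fields -contains "AddressLine2") {
        # Remove AddressLine1 first if it's already in the view,
        # so it ends up in the right position without duplicates.
        if ($fields -contains "AddressLine1") {
            $view.ViewFields.Delete("AddressLine1")
        }
        $position = $view.ViewFields.ToStringCollection().IndexOf("AddressLine2")
        $view.ViewFields.Add("AddressLine1")
        $view.ViewFields.MoveFieldTo("AddressLine1", $position)
        $view.Update()
    }
}

$web.Dispose()
```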
Had to write a PowerShell script to go through a bunch of lists across different sites to enable Metadata Navigation and Filtering and include some Key Filters.
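For a single list, the core of that script looks something like this sketch, which uses the MetadataNavigationSettings API from the server-side object model; the URL, list name, and the "Status" key-filter field are placeholders for this example:

```powershell
# Sketch: enable Metadata Navigation and Filtering on a list and
# add a Key Filter for one of its fields.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue
Add-Type -Path "$env:CommonProgramFiles\microsoft shared\Web Server Extensions\15\ISAPI\Microsoft.Office.DocumentManagement.dll"

$web  = Get-SPWeb "http://sharepoint/sites/docs"   # hypothetical web URL
$list = $web.Lists["Documents"]                     # hypothetical list name

$settings = [Microsoft.Office.DocumentManagement.MetadataNavigationSettings]::GetMetadataNavigationSettings($list)

# Add a Key Filter for a field on the list ("Status" is an assumption here).
$field     = $list.Fields["Status"]
$keyFilter = New-Object Microsoft.Office.DocumentManagement.MetadataNavigationKeyFilter($field)
$settings.AddConfiguredKeyFilter($keyFilter)

# The final $true also updates the list's indexes to support the filters.
[Microsoft.Office.DocumentManagement.MetadataNavigationSettings]::SetMetadataNavigationSettings($list, $settings, $true)

$web.Dispose()
```

To cover "a bunch of lists across different sites", wrap this in loops over `$site.AllWebs` and each web's lists, filtering to the lists you care about.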
The list view threshold is set to 5,000 by default, and should not be increased unless you like slow-performing SharePoint sites.
Reducing the list view threshold provides very minimal performance gains, so don't bother. Common ways to hit the threshold include:
- Accessing list views that contain too many items
- Pagination does not help, as the backend query still retrieves all results; pagination only breaks up how the results are displayed
- Attempting to perform privileged operations (adding a column, creating an index, adding content types)
- Custom code that accesses a large list
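To check what the threshold is currently set to for a web application (rather than changing it), a quick sketch with a placeholder URL:

```powershell
# Sketch: inspect the list view threshold for a web application.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$wa = Get-SPWebApplication "http://sharepoint"   # hypothetical web application URL
$wa.MaxItemsPerThrottledOperation                # the list view threshold, 5000 by default
```

If you find yourself tempted to raise `MaxItemsPerThrottledOperation`, the better fixes are usually indexed columns, filtered views, or restructuring the list.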