Hmm...This Might Work

Solutions from a day long since past

Wednesday, November 13, 2013

Using PowerShell to Check AD Schema

Here we are, a cold, crisp 20-degree Wednesday in November. I thought to myself…this is not cool (no pun intended), but you know what is cool? Yeah, I'm sure you guessed it: PowerShell's ActiveDirectory module.

Just a quick blog note to show how PowerShell quickly settled a dispute during an upgrade of our AD schema to handle a Windows 2012 DC. Of course this wasn't a big dispute, and many other tools could have answered the question, which was whether the schema had already been extended to support Windows Server 2012. What made this so cool was being able to share the experience with others who didn't know PowerShell could replace some of the old standby AD tools. So this is more of an AH-HA moment that felt right to share (along with the script)…all brought to us by PowerShell and the ActiveDirectory module.

(An academic honesty note here…this script is not 100% my own work. More like 5%–10% is mine; I can't remember where I snagged the meat of this script, so the credit remains with an unknown author.)

#This script will query AD for the schema version of AD, Exchange, and Lync. Can be run as a least-privileged user.

Import-Module ActiveDirectory

#Array to collect the results
$SchemaVersions = @()

#AD Portion
$SchemaHashAD = @{
    13 = "Windows 2000 Server";
    30 = "Windows Server 2003";
    31 = "Windows Server 2003 R2";
    44 = "Windows Server 2008";
    47 = "Windows Server 2008 R2";
    56 = "Windows Server 2012"
}

$SchemaPartition = (Get-ADRootDSE).NamingContexts | Where-Object {$_ -like "*Schema*"}
$SchemaVersionAD = (Get-ADObject $SchemaPartition -Property objectVersion).objectVersion

$AdSchema = New-Object System.Object
$AdSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionAD
$AdSchema | Add-Member -Type NoteProperty -Name Product -Value "AD"
$AdSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashAD.Item($SchemaVersionAD)
$SchemaVersions += $AdSchema

#Exchange Portion
$SchemaHashExchange = @{
    4397  = "Exchange Server 2000 RTM";
    4406  = "Exchange Server 2000 SP3";
    6870  = "Exchange Server 2003 RTM";
    6936  = "Exchange Server 2003 SP3";
    10628 = "Exchange Server 2007 RTM";
    10637 = "Exchange Server 2007 RTM";
    11116 = "Exchange 2007 SP1";
    14622 = "Exchange 2007 SP2 or Exchange 2010 RTM";
    14726 = "Exchange 2010 SP1";
    14732 = "Exchange 2010 SP2";
    15137 = "Exchange 2013"
}

$SchemaPathExchange = "CN=ms-Exch-Schema-Version-Pt,$SchemaPartition"
If (Test-Path "AD:$SchemaPathExchange") {
    $SchemaVersionExchange = (Get-ADObject $SchemaPathExchange -Property rangeUpper).rangeUpper
}
Else {
    $ExchangeErr = 1
}

$ExchSchema = New-Object System.Object
$ExchSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionExchange
$ExchSchema | Add-Member -Type NoteProperty -Name Product -Value "Exchange"
$ExchSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashExchange.Item($SchemaVersionExchange)
#Only add the row if the Exchange schema object was actually found ($null means not present)
If ($ExchSchema.Schema -ne $null) {
    $SchemaVersions += $ExchSchema
}

#Lync Portion
$SchemaHashLync = @{
    1006 = "LCS 2005";
    1007 = "OCS 2007 R1";
    1008 = "OCS 2007 R2";
    1100 = "Lync Server 2010";
    1150 = "Lync Server 2013"
}

$SchemaPathLync = "CN=ms-RTC-SIP-SchemaVersion,$SchemaPartition"
If (Test-Path "AD:$SchemaPathLync") {
    $SchemaVersionLync = (Get-ADObject $SchemaPathLync -Property rangeUpper).rangeUpper
}
Else {
    $LyncErr = 1
}

$LyncSchema = New-Object System.Object
$LyncSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionLync
$LyncSchema | Add-Member -Type NoteProperty -Name Product -Value "Lync"
$LyncSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashLync.Item($SchemaVersionLync)
#Only add the row if the Lync/OCS schema object was actually found
If ($LyncSchema.Schema -ne $null) {
    $SchemaVersions += $LyncSchema
}

#Output Section
Write-Host "Known current schema version of products:"
$SchemaVersions | Format-Table * -AutoSize

#This error handling is probably better off where the note properties are set, but it takes care of things for now
If ($LyncErr -eq 1) {
    Write-Host "Lync or OCS not present" -ForegroundColor Yellow
}
If ($ExchangeErr -eq 1) {
    Write-Host "Exchange not present" -ForegroundColor Yellow
}

#---------------------------------------------------------------------------><>
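For reference, the output is a simple table along these lines (the values shown here are illustrative; yours will reflect your own environment):

Known current schema version of products:

Schema Product  Version
------ -------  -------
    56 AD       Windows Server 2012
 14732 Exchange Exchange 2010 SP2
  1100 Lync     Lync Server 2010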

So there you have it, another way PowerShell rocks.

posted @ Wednesday, November 13, 2013 10:22 AM | Feedback (0) | Filed Under [ ActiveDirectory Powershell ]

Tuesday, November 12, 2013

Hiding Disabled Users From Exchange Address Book

The other day while reviewing an Exchange 2010 environment, I noticed a few active mailboxes belonging to disabled users. For obvious reasons this isn't a good thing; if nothing else, it clutters up the Exchange Address Book.

The next thought in my mind…so what's the best way to hide these disabled users? Having the PowerShell bias that I do in fact have, I had to spend 15 minutes reviewing the options.

  1. Use a manual process. This would include disabling the user in AD, followed up with the steps described here.
  2. Use Exchange Address Book Policies (ABPs). As indicated in this article, ABPs have a dependency on Exchange 2010 SP2. That said, it seems like a viable and interesting approach.
  3. Use PowerShell. As I stated from the outset, I'm biased right now…so a PowerShell-only approach seems "more better".

Here is the script I used in a resource/user domain environment. Keep in mind this is a down-and-dirty version, a proof of concept. I would treat this example as inspiration only (good or bad).

#This script will query for all linked mailboxes when run on an Exchange Server.
#It returns the set of users whose linked master accounts are disabled.
#Use the results with "Set-Mailbox -HiddenFromAddressListsEnabled $true" to hide
#all of the disabled users from the address book. Example below.

Add-PSSnapin Microsoft.Exchange.Management.PowerShell.E2010 -ErrorAction Continue
Import-Module ActiveDirectory

$linkmbx = Get-Mailbox -RecipientTypeDetails LinkedMailbox
$alcusers = Get-ADUser -Filter * -Server <your domain here> -Properties Enabled
$userrpt = @()

foreach ($mbx in $linkmbx) {
    $name = $mbx.LinkedMasterAccount
    $user = $name.Split("\")
    $alcuser = $alcusers | Where {$_.SamAccountName -eq $user[1]}
    if ($alcuser.Enabled -eq $false) {
        $rpt = New-Object System.Object
        $rpt | Add-Member -MemberType NoteProperty -Name Name -Value $alcuser.Name
        $rpt | Add-Member -MemberType NoteProperty -Name Alias -Value $mbx.Alias
        $rpt | Add-Member -MemberType NoteProperty -Name HidFromAddBook -Value $mbx.HiddenFromAddressListsEnabled
        $userrpt += $rpt
        $rpt
    }
}

Write-Host "There are" $userrpt.Count "linked mailboxes with disabled user accounts in user domain"

<#
#Uncomment this section if you want to include changing the address book visibility
Foreach ($user in $userrpt){
    Write-Host "Changing address book visibility for" $user.Alias
    Set-Mailbox -Identity $user.Alias -HiddenFromAddressListsEnabled $true
}
#>
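If you want to double-check the results after flipping the switch, a quick one-liner (a sketch using standard Get-Mailbox parameters) will list everything currently hidden from the address lists:

#List all mailboxes currently hidden from the address book
Get-Mailbox -ResultSize Unlimited | Where {$_.HiddenFromAddressListsEnabled -eq $true} | Select Name, Alias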

Of course the next thought of automation comes to mind…but that’s a different post. 

posted @ Tuesday, November 12, 2013 3:15 PM | Feedback (0) | Filed Under [ Exchange 2010 Powershell ]

Thursday, November 07, 2013

Create SharePoint 2013 Result Source with PowerShell

In my continued automation efforts, I was looking to convert documentation provided by a consultant into something more…well…automated. In this first of four parts I'll walk through creating a Result Source with PowerShell.

Subsequent posts (parts 2–4) will give examples of creating Result Types, Query Rules, and Search Navigation. The aim of this effort is to use PowerShell to rebuild (essentially clone, without data) a search service application. This is useful when Microsoft support gives the classic solution of "rebuild your service application." Doh!

It should be noted:

  • This will create the Result Source at the Site Collection level.
  • This isn't 100% my original work; it's inspired by (taken mostly from, and modified) the SearchGuys blog post.
    • The blog had Bing and federation as its example; this example is a local SharePoint result source to query BCS.

 

Add-PSSnapin Microsoft.SharePoint.PowerShell

#Change these variables to fit
$SPWeb = "Your SP Site Collection Here"
$resultSourceName = "Your Content Source Friendly Name Here"
$resultSourceDescription = "Description for (BCS) Data Source"
$qT = '{searchTerms?} (ContentSource="<Content Source Friendly Name Here>" IsContainer=false)'

#Begin the process
$ssa = Get-SPEnterpriseSearchServiceApplication
$fedman = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)
$searchOwner = Get-SPEnterpriseSearchOwner -SPWeb $SPWeb -Level SPSite
$resultSource = $fedman.GetSourceByName($resultSourceName, $searchOwner)

#Check to see if it already exists
if (!$resultSource) {
    Write-Host "Result source does not exist. Creating."
    $resultSource = $fedman.CreateSource($searchOwner)
}
else { Write-Host "Using existing result source." }

#Finish it up
$resultSource.Name = $resultSourceName
$resultSource.ProviderId = $fedman.ListProviders()['Local SharePoint Provider'].Id
$resultSource.Description = $resultSourceDescription
$resultSource.CreateQueryTransform($qT)
$resultSource.Commit()
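A quick sanity check afterward, reusing the $fedman and $searchOwner objects from the script above, is to ask the FederationManager for the source again and confirm it comes back:

#Verify the result source is now registered at the site collection level
$check = $fedman.GetSourceByName($resultSourceName, $searchOwner)
if ($check) { Write-Host "Result source" $check.Name "created with Id" $check.Id }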

posted @ Thursday, November 07, 2013 7:55 PM | Feedback (0) |

Working with SharePoint Web Parts using PowerShell

First things first. I’m not a developer. I seem to do ok working my way through the SharePoint object model with PowerShell and writing automation scripts, but that doesn’t make me a developer.

With that out of the way, I do find myself in an odd place, with developers neglecting (for whatever reason) to automate the population of web parts in the content pages their solution has deployed. An example might be a Content Search web part needing to have the proper display template selected for displaying conversations. Ah…my dev friends (don't hate me), but why not go that extra mile? If I can do it through script, surely you can employ your superior coding skills to include it in the solution (wsp)!

For those of you who may find yourselves in my shoes, here is a script I created to help ease that cross-farm (environment) pain. Essentially the script is a function with a few parameters. This is the core; from here you can customize to your needs. It's a great starting point for anyone who wants to automate changes to web parts via PowerShell. You can copy and paste the script below into PowerShell ISE and save it under whatever name you like (e.g., SetWebParts.ps1).

 

Add-PSSnapin Microsoft.SharePoint.Powershell

#----Start of Function-------
Function Set-WebParts {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,Position=0)]
        [String]$SiteUrl = $(Read-Host "Please Enter Site URL"),
        [Parameter(Mandatory=$True,Position=1)]
        [String]$PageUrl = $(Read-Host "Please Enter Page URL")
    )

    $web = Get-SPWeb $SiteUrl

    #+Get and check out the page
    $page = $web.GetFile($PageUrl)
    $page.CheckOut()

    #+Load the limited web part manager
    $wpm = $web.GetLimitedWebPartManager($PageUrl, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

    #+Find the target web parts - in my example some were titled "Conversations" and some "Content Search"
    $wps = $wpm.WebParts | Where {$_.Title -eq "Conversations" -or $_.Title -eq "Content Search"}

    #+This is the base URL for the template - change this to whatever meets the need
    $base = "~sitecollection/_catalogs/masterpage/Display Templates/Content Web Parts/"
    #+This is the template name
    $template = "Item_Discussion.js"
    $NewItemTemplateId = $base + $template

    #+Actually set the template on each matching part (the Where above can return more than one)
    foreach ($wp in $wps) {
        $wp.ItemTemplateId = $NewItemTemplateId
        $wpm.SaveChanges($wp)
    }

    #+Check in and publish the page
    $page.CheckIn("Scripted Change")
    $page.Publish("Scripted Publish")
    $web.Dispose()
}
#----End Of Function----
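Calling the function is then a one-liner; the site and page URLs below are hypothetical placeholders for your own environment:

#Example invocation with placeholder URLs
Set-WebParts -SiteUrl "http://portal.contoso.com/sites/teams" -PageUrl "Pages/Home.aspx"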


As I said, I'm not a developer, so while functional, there is probably a better way to accomplish what I've published here.

posted @ Thursday, November 07, 2013 5:31 PM | Feedback (0) | Filed Under [ SharePoint Rants Powershell SharePoint 2013 ]

SharePoint 2013 Managed Metadata Service Application (MMSA) Gremlins

 

This post's objective: to simply document something I can't explain.

First, the environment has a single WFE and two different app servers (APP1 and APP2). The SharePoint environment in question is running 2013 RTM bits (I know)…

The Timeline
  • Roughly 24 hours ago

An unplanned deployment of a custom farm solution; mostly just an automated deployment of content pages with custom web parts. Nothing out of the ordinary here.

The standard post-deploy testing revealed nothing out of the ordinary, other than Search and SSRS not playing well with each other in this production farm.

<rant>It seems the Microsoft support solution is to recreate the Search Service App; then SSRS will play nice in the logs. I'll tell you this works, but it's not what I'd call acceptable…</rant>

  • 10 hours ago

Notification of Managed Metadata navigation malfunction by user

  • 6 hours ago

Start of troubleshooting the MMSA. The service application interface in CA had an error indicating "The Managed Metadata Service or Connection is currently not available. The Application Pool or Managed Metadata Web Service may not have been started. Please Contact your Administrator." Naturally I figured there was a stopped application pool, which there was; it just wasn't the one running this service.

Next I tried to open the service connection properties, only to get this error (from ULS): Application error when access /_admin/ManageMetadataProxy.aspx, Error=Retrieving the COM class factory for component with CLSID {BDEADF26-C265-11D0-BCED-00A0C90AB50F} failed due to the following error: 800703fa Illegal operation attempted on a registry key that has been marked for deletion. (Exception from HRESULT: 0x800703FA).

At this point I'm at a loss and figure I'll try to restart the services via CA. Just for good measure I started the service on each server (WFE, APP1, APP2). Same results, nothing changed.

Read a blog suggesting the unlikely event of the application pool account needing access to the service application. It worked well prior to this event, but for good measure let's add it in. Same results, nothing changed.

  • 3 hours ago

Resigned to throw the Hail Mary of an IISReset, just to see if it would commit anything changed to this point. Sent notification to the enterprise giving a heads-up of the unplanned reset.

  • 2 hours ago

Getting ready to go for lunch, I figured I'd take a quick look before throwing the switch on the IISReset. Before checking the MMSA, I ran Get-CacheClusterHealth only to get the error "No valid cluster settings were provided with Use-CacheCluster." Not a big deal; I anticipated this, so I ran Use-CacheCluster and then Get-CacheClusterHealth once more. This time I received the expected cluster health statistics. Getting somewhat anxious to make some headway, I figured I'd flip back over to the MMSA to make sure it was in fact still broken.
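For anyone following along at home, the sequence was roughly this, run from an elevated PowerShell session on a cache host (the cmdlets come from the AppFabric DistributedCacheAdministration module that ships with SharePoint 2013's Distributed Cache):

#Load the AppFabric cache administration module
Import-Module DistributedCacheAdministration
#Point the session at the local cluster configuration, then check health
Use-CacheCluster
Get-CacheClusterHealth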

So yeah, as you might have guessed: it automagically started working.

  • 4 hours in the future

A cold beer or maybe…just maybe…a good shot of tequila.

Closing

In the end I can only blame the events on gremlins; someone clearly fed the SharePoint Mogwai after dark, and they had fun wreaking havoc. I can only send thanks to Rambo-Gizmo for eradicating the issue.

What I hate most about today's events is the numerous posts, such as this one by SharePointBabe, coming to the conclusion that it's just quicker to rebuild the service application. My issue is that this is not really an acceptable solution. I wouldn't have such an issue if Microsoft support didn't take the same approach…but then I've already had my rant for this post.

posted @ Thursday, November 07, 2013 3:09 PM | Feedback (0) | Filed Under [ SharePoint Rants SharePoint 2013 ]

Thursday, December 06, 2012

A Lesson About Notes Restricted Groups

This is an experience (rant) I share for the sake of those who may have something similar in their lives. As of this writing I am leading a Lotus Notes 8.5 to Exchange 2010 migration. Our requirements drove the use of a co-existence solution offered by companies such as Binary Tree or Quest. Co-existence here means sharing a single mail namespace between Exchange and Lotus Notes.

In our case Binary Tree was chosen; however, their co-existence solution fell somewhat short when it came to preserving functionality for migrated users who need the ability to send mail to restricted mail groups in Lotus Notes. During the user migration process, the Notes person document is updated to facilitate mail routing to and from Exchange. That said, one would think a co-existence solution would take the necessary steps to update the permissions of any groups contextual to the migrated user. Apparently one would be very wrong, at least in the case of the Binary Tree solution.

As for the fix, it was as simple as adding the primary SMTP short name (e.g., first.last from first.last@domain.com) to the restricted group ACL within the Domino (Notes) address book. Once this change is complete, replicate the address book across your Domino environment. This clears up those lovely Notes mail router responses of "Not authorized to send mail to this user or group" and, most importantly, restores user function across the co-existing environment.

posted @ Thursday, December 06, 2012 10:22 AM | Feedback (0) | Filed Under [ Rants Lotus Notes Migration Exchange 2010 Exchange ]

Tuesday, September 11, 2012

An Awesome SharePoint Migration Tool

As I prepare for our next SharePoint user group meeting, I have to take a moment to send out a shout of praise and thanks to the folks at Sharegate. Thank you for making such a great migration tool; it's, as your slogan indicates, a "no brainer".

For those of you not familiar with Sharegate, I suggest you get that way. But first allow me to share some of the back story of how this blog post came to be.

Rewind roughly six weeks: we (my co-workers and I) were in the middle of a SharePoint deployment when we concluded we needed to introduce a new requirement which would change our information architecture. The resulting task was to consolidate three web applications into one within the same farm. Upon completion of this "simple" task we would continue our deployment and development activities around the new information architecture.

Normally I would say no sweat, we'll just move the data by means of export and re-import. Sadly, this wasn't an option, since we quickly learned all of our desired data didn't live in the same farm, and to make things more complicated, all of the content was rich with managed metadata.

Our first try was to call on development resources; they gave it their best shot, but it turned into a bigger issue than we had first sized up (I'm sure this hasn't happened to anyone else). Long story short, we were spinning our wheels, only to see the potential for many wasted dollars in terms of time.

The second try was to bring PowerShell to the rescue; the thought of exporting the Managed Metadata Service looked promising, however we were running into resource and time constraints. We now had our backs against the wall.

As it turns out, the third time was the charm. I can't recall how we stumbled upon them (probably Google), but we did stumble upon the Sharegate Migration Suite. Brilliant marketing: offer a fully functional trial that will move five records at a time for the length of the free trial!… THIS SAVED OUR PROJECT!

We were able to consolidate enough data in 20 minutes to allow our deployment activities to continue. Mind you we had spent one very long week trying to do this on our own. This is one of those moments where I felt very stupid for not starting with Sharegate.

What made this work was its client-based footprint, using web services to gather the data. There wasn't a server install required. That was music to our ears!

In the first use, compared to the cost of a development resource attempting to create a console app, the tool paid for itself. The console app would have pretty much been a "one and done". I'll concede the core code may have been reusable to another developer, but not to me as an admin, at least not in the fashion it was going to be designed.

Our subsequent uses have now paid for the product a couple more times over when compared to a build-your-own effort.

If you are in a metadata migration nightmare or just want to empower a power user, then please do yourself a favor and evaluate this tool; strongly consider it as a resource. Don't keep this tool for yourself; make it a resource for that power user or help-desk staff.

Just understand this tool is subscription-based. That said, if you are migrating once, it should be a "no brainer" to buy it.

You can find Sharegate at http://en.share-gate.com/


posted @ Tuesday, September 11, 2012 6:45 PM | Feedback (0) | Filed Under [ SharePoint Rants SharePoint 2010 ]

Wednesday, March 14, 2012

AD User Attribute Reference

I know, like the blogging world needs yet another post on AD user attributes. There are so many out there, and countless message boards besides; however, as I sit and listen to Bill Withers sing his classic song "Lean On Me", I can't help but think maybe I should put a fresh post out for those who are new to SharePoint and new to this whole thing called AD.
I know what you're thinking: why waste the energy on something this simple? I have only one answer.
Because it must be done. SharePoint loves to consume external data if given the chance.
That said, for those who have stuck with me this long, here are the resources I've used in my data mapping exercises.
These are the real fun exercises you get to do when AD hasn't been kept up and you need to pitch in to help update it so SharePoint UPS can shine through and work its magic of populating user data from its external source. Not doing this would be like leaving money on the table; after all, Microsoft (MSFT) gives us Forefront Identity Manager (FIM) lite as part of SharePoint…let's use it!
So now that we have a reference list, we can use something like PowerShell to dump a list of AD users into Excel and begin the good old-fashioned data massage. Of course we want to take the time to make sure this data massage is documented, right? After all, we can turn this documentation into our roadmap for automation and give our infrastructure counterparts an extra hand with their workload. I mean, let's face it, without them we are up a creek; our beloved applications are dependent upon their work. I toast them as the unsung heroes of our profession…(just don't tell anyone)
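As a minimal sketch (the attribute list here is just an example; swap in whatever your mapping exercise calls for), the dump can be as simple as:

#Dump common user attributes to CSV for the data massage in Excel
Import-Module ActiveDirectory
Get-ADUser -Filter * -Properties DisplayName, Title, Department, Manager, mail |
    Select-Object SamAccountName, DisplayName, Title, Department, Manager, mail |
    Export-Csv -Path .\ADUserAttributes.csv -NoTypeInformation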
I think this post lays the foundation for an upcoming "how to", which admittedly would be a greater help than pointing you to the dictionary and leaving you saying "hmmm…this might work" :)

posted @ Wednesday, March 14, 2012 7:47 PM | Feedback (0) | Filed Under [ SharePoint ActiveDirectory Rants ]

Thursday, February 16, 2012

Productivity HUB RSS Feed

 

Recently I had the opportunity to demo the SP1 edition of Microsoft's SharePoint Productivity Hub. The environment had Cisco's ScanSafe product doing some web filtering, which was causing my claims-based app to return the error:

ProtocolError occured trying to complete the request. The server returned a status code of : ProxyAuthenticationRequired and the status description is : "Proxy Authentication Denied"

After a brief period of using Google for a quick fix, I discovered one of two things: either my search was down the wrong rabbit trail or there wasn't much in the way of a useful quick fix listed for claims apps. Giving credit where credit is due, the following site got the brain out of the fog.

From here I figured it was time to break out the MSDN site to figure out how this element works.

After reviewing the MSDN page on the defaultProxy node in the web.config, I came to the next logical step of disabling it, thinking HMMM…This Might Work.

So from OLD to NEW, then a quick refresh of the browser and voila!

OLD:

<system.net>
    <defaultProxy />
</system.net>

NEW:

<system.net>
    <defaultProxy enabled="false" />
</system.net>

The MSDN page can be found here: http://msdn.microsoft.com/en-us/library/kd3cf2ex.aspx


posted @ Thursday, February 16, 2012 1:05 PM | Feedback (0) | Filed Under [ SharePoint 2010 ]

Monday, October 18, 2010

Troubleshooting SharePoint Approval Workflow

Recently I was helping someone troubleshoot a stock Approval Workflow within a document library. Their request was simple…"Help!" They continued with…"My workflow used to work but all of a sudden has stopped notifying people until the task is overdue…not sure when this changed but I need help now." If this sounds familiar, read on.

Since we all know SharePoint never just stops (ha ha), I figured I needed to drop down and look at the little things; you never know, it just might work. The task list was enabled to send mail upon creation, and long story short I had to hit the forums for advice; after all, everything seemed in order, even the little things.

The point of this post is to give props to a very helpful blog post by Steve Chen, where SharePoint 2007 alerts are explained in some very good detail, and another to the creators of the SharePoint Manager tool on CodePlex.

As it turns out, the solution to the problem was using SharePoint Manager to set the EnableToAssignEmail flag to "False", wait for the timer job to run (after we used stsadm to verify the job-immediate-alerts schedule was set), and then use the manager tool to reset the EnableToAssignEmail flag back to "True". One important note here: make sure you hit save…otherwise it's not going to write the change to SharePoint.
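For reference, checking (and if needed, resetting) the immediate alerts schedule with stsadm looks roughly like this; the web application URL is a placeholder for your own:

#Check the current schedule for the immediate alerts timer job
stsadm -o getproperty -propertyname job-immediate-alerts -url http://yourwebapp

#Reset it to the default five-minute schedule if it came back empty
stsadm -o setproperty -propertyname job-immediate-alerts -propertyvalue "every 5 minutes between 0 and 59" -url http://yourwebapp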

 

It would also be wise to remember you are using a tool that can seriously, and rather easily, cause some harm to SharePoint if used by someone unfamiliar with it.

posted @ Monday, October 18, 2010 5:06 PM | Feedback (0) | Filed Under [ SharePoint ]
