Hmm...This Might Work

Solutions from a day long since past

Monday, June 30, 2014

New-SelfSignedCertificate and CERT Provider

 

Well, the non-whimsical title aside, I must say “hats off” to those PowerShell gurus at Microsoft. You’ve made my life a bit easier.
This quick post is a look at the New-SelfSignedCertificate cmdlet and the PowerShell Certificate (Cert:) provider.

I realize both are rather self-explanatory: the first creates a self-signed certificate, whereas the other provides directory-like interaction / access within PowerShell to the certificate stores, essentially making the need to spin up ye old MMC certificates console a moot point.
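As a quick illustration of that directory-like access (nothing lab-specific here, just the provider doing its thing), you can browse the certificate stores much like a file system:

#Treat the certificate stores like a drive
Set-Location Cert:\LocalMachine\My

#List the Personal store, file-system style
Get-ChildItem Cert:\LocalMachine\My | Select-Object Subject, Thumbprint, NotAfter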

Suppose you’re tasked with building a functional lab {insert Microsoft software title here} environment. Naturally you want to automate as much as possible, yet those pesky certificates cause you to break open IIS to create a self-signed certificate. Sure, it’s only an extra manual step or two, but my take: why do it manually when automation isn’t but half the effort more?

That said, what if the requirements are for a SharePoint 2013 lab with a functioning app model? The app model brings with it a requirement for wildcard certificates. Now, I could be missing something, but my testing within IIS 8 didn’t allow for specifying the FQDN (CN).

I guess at this point one could consider a few options. First, one might be inclined to stand up a lab PKI (or leverage an existing one). Of course, a simpler but costlier route would be to use public certificates.

If time and money are constraints, then our friendly neighborhood PowerShell cmdlet and Cert: provider can quickly help us out. After all, New-SelfSignedCertificate will let us specify our DNS name or DNS names. Yes, that was the plural of name, as in more than one. And since we are talking DNS names, we find ourselves limited only by what is defined in DNS or the server HOSTS file (none of us do that, right?).

So take this snippet and incorporate it into your automated lab builds, or conversely offer your own opinion.

#Issue A Self Signed Cert

New-SelfSignedCertificate -CertStoreLocation Cert:\LocalMachine\My -DnsName *.subdomain1.subdomain.domain.org, hostname.subdomain1.subdomain.subdomain.org

#Export Self Signed Cert To Temp Location

Get-ChildItem Cert:\LocalMachine\My | Where {$_.Subject -like "*subdomain.domain.org"} | Export-Certificate -Type CERT -FilePath E:\Temp\SelfSign.cert

#Import To TRUSTED ROOT AUTHORITY – This prevents browser Errors

Import-Certificate -FilePath E:\temp\SelfSign.cert -CertStoreLocation Cert:\LocalMachine\Root

#Clean Up Temp

Remove-Item -Path E:\Temp\SelfSign.cert

#Move Certificate From Personal To WebHosting

Get-ChildItem Cert:\LocalMachine\My | Where {$_.Subject -like "*subdomain.domain.org"} | Move-Item -Destination Cert:\LocalMachine\WebHosting
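If, like me, the next stop is wiring the certificate up to an IIS web site, that can be scripted too. Treat this as a rough sketch only; the site name is a placeholder and your bindings will differ:

Import-Module WebAdministration

#Grab the freshly moved certificate from the WebHosting store
$Cert = Get-ChildItem Cert:\LocalMachine\WebHosting | Where {$_.Subject -like "*subdomain.domain.org"} | Select-Object -First 1

#Create an HTTPS binding on a placeholder site and attach the certificate by thumbprint
New-WebBinding -Name "<Your IIS Site Name>" -Protocol https -Port 443
(Get-WebBinding -Name "<Your IIS Site Name>" -Protocol https).AddSslCertificate($Cert.Thumbprint, "WebHosting")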

 

Hope this helps someone.

posted @ Monday, June 30, 2014 4:49 PM | Feedback (0) | Filed Under [ SharePoint 2013 ]

Thursday, May 29, 2014

AD User Account Creation–Script

I’m throwing this script out there for any developers or admins seeking a quick SharePoint-focused script for user account creation.

In my environment we’re working towards an automated, unattended install of SharePoint, including account creation. AutoSPInstaller is cool, but it seems too complicated for me. Put another way, if I’m going to spend time learning, I’m choosing to learn PowerShell and SharePoint in more detail.

With that said, here is my script. It could easily import from a CSV (or other file), read a SharePoint list, or use any other means of input. For the purpose of an example, an array is used.

 

Import-Module ActiveDirectory

#Make use of an array just for example. This could easily be a CSV, but since this was for dev an array was easier

#In case of CSV, column order would be "SamAccountName,FName,LName,Description,Password"

#$UserList = Import-CSV -Path <csv path here>

$UserList = @(

@("<SamAccountName>","<FName>","<LName>","<Description>","<PassworD>"),

@("<SamAccountName1>","<FName1>","<LName1>","<Description1>","<PassworD1>"),

@("<SamAccountName2>","<FName2>","<LName2>","<Description2>","<PassworD2>")

)

#Loop Through Each Nested Array

ForEach ($User in $UserList) {

$SamAccountName = $User[0] #(Read-Host -Prompt "Please Enter SamAccountName")

#Check to see if the SamAccountName already exists; if it doesn't, create it

If (!(Get-ADUser -Filter {SamAccountName -eq $SamAccountName})){

    $OUPath = "OU=<OUName>,OU=<OUName>,DC=<DomainRoot>,DC=<dot suffix>"

    $DomainSuffix = "@<domain.org>"

    $FName = $User[1]

    $LName = $User[2]

    $Description = $User[3]

    $PassWord = ConvertTo-SecureString ($User[4]) -AsPlainText -Force #(Read-Host -Prompt "Enter Account Password" -AsSecureString)

    New-ADUser -Name ($FName+" "+$LName) -SamAccountName $SamAccountName -GivenName $FName -Surname $LName -DisplayName ($FName+" "+$LName) -Path $OUPath -UserPrincipalName ($SamAccountName+$DomainSuffix) -Description $Description -AccountPassword $PassWord -PasswordNeverExpires:$true -Enabled:$true

    }

Else

    {

    Write-Host "$SamAccountName already exists within AD. It will not be created"

    }

}
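If you do go the CSV route mentioned in the comments, the loop barely changes; Import-Csv hands back objects with named properties instead of positional array elements. A rough sketch, assuming a header row matching the column order above:

#Hypothetical CSV with header: SamAccountName,FName,LName,Description,Password
$UserList = Import-Csv -Path "<csv path here>"

ForEach ($User in $UserList) {
    $SamAccountName = $User.SamAccountName
    $PassWord = ConvertTo-SecureString $User.Password -AsPlainText -Force
    #...the rest of the loop stays the same, swapping $User[n] for $User.<ColumnName>
}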

 

posted @ Thursday, May 29, 2014 4:18 PM | Feedback (0) | Filed Under [ SharePoint ActiveDirectory SharePoint 2010 Powershell SharePoint 2013 ]

Disable LoopBack Script - SharePoint

A friend of mine and I were discussing the topic of Disabling LoopBack and which is the better route to go when it comes to creating a new SharePoint 2013 farm.

My perspective: when possible, abide by Microsoft’s recommendations. In this case Microsoft seems to take the classic “it depends” stance on disabling ye old loopback check.

Being me, I asked myself: why not have both? After all, I’m sure PowerShell could help out here. To those who know me or have read this blog: I’m not a developer, and it probably shows in various scripts. That said, the following script does accomplish the desired result of one script letting you choose whether to take the “Developer” route and disable the loopback check, or take the “Admin” route and call out your exceptions. If you are a new SharePoint admin this might be useful; if you aren’t new to SharePoint then I’m sure you’ve already clicked off of this post a few lines back…lol

#Disable LoopBack Or Enter Back Connection Name

#Use this to avoid disabling loopback - http://support.microsoft.com/kb/896861

#Define your FQDNs when prompted

$DisableLoopBack = $null

Do {$DisableLoopBack = (Read-Host "Would you like to disable loopback? (Yes / No)") }

Until ($DisableLoopBack -eq "Yes" -or $DisableLoopBack -eq "No")

If ($DisableLoopBack -eq "Yes"){

Write-Warning "According To Microsoft You Should NOT disable loopback, HOWEVER it's a common development practice."

New-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa -Name "DisableLoopbackCheck" -value "1" -PropertyType dword -Force | Out-Null

IISReset /noforce

}

Elseif ($DisableLoopBack -eq "No"){

$HostNames = @()

$HostNames+=($env:COMPUTERNAME+"."+$env:USERDNSDOMAIN)

$More = $null

Do {

Write-Host "Here are the host names that will be added to the BackConnectionHostNames Exception List"

Write-Host $HostNames -ForegroundColor Green

$More = (Read-Host "Would you like to add others? (Yes / No)")

If ($More -eq "Yes"){

$AddHost = (Read-Host "Enter FQDN Host Name")

$HostNames+=$AddHost

Write-Host "$AddHost has been added to the list of names above" -ForegroundColor Green


}

}

Until ($More -eq "No")

New-ItemProperty HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0 -Name "BackConnectionHostNames" -Value $HostNames -PropertyType MultiString -Force | Out-Null

IISReset /noforce

}
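Whichever route the script takes, a quick sanity check afterwards never hurts; this just reads the values back:

#Confirm whichever value the script wrote
Get-ItemProperty HKLM:\System\CurrentControlSet\Control\Lsa -Name DisableLoopbackCheck -ErrorAction SilentlyContinue | Select-Object DisableLoopbackCheck

Get-ItemProperty HKLM:\SYSTEM\CurrentControlSet\Control\Lsa\MSV1_0 -Name BackConnectionHostNames -ErrorAction SilentlyContinue | Select-Object -ExpandProperty BackConnectionHostNames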

posted @ Thursday, May 29, 2014 4:05 PM | Feedback (0) | Filed Under [ SharePoint 2010 Powershell SharePoint 2013 ]

Thursday, January 2, 2014

Uploading Files to SharePoint using PowerShell

Today I had to fall in line and do something I’m not entirely proud of: I had to create a script to replicate files from a file share to SharePoint. Struggling to find value in the effort, I figured a blog post to remind me of this event was fitting. The script was rather basic, but something good came of it: I took the opportunity to grow, exploring the world of PowerShell-to-SharePoint interaction outside of the SharePoint PS snap-in.

Surprisingly, it was a bit easier than I had imagined. While I wish I could take 100% of the credit, the inspiration for this function comes from this article.

Function Upload-SPFile {

Param (

#Local or Network Path

[parameter(Mandatory=$true)][string]$UncPath,

#SharePoint URL including folder

#https://baseurl/serverrelativeurl

[parameter(Mandatory=$true)][string]$SPURL

)

$UploadPath = $SPURL.TrimEnd('/') + "/" + (Split-Path -Leaf $UncPath)

$WebClient = New-Object System.Net.WebClient

$WebClient.Credentials = [System.Net.CredentialCache]::DefaultCredentials

$WebClient.UploadFile($UploadPath,"Put",$UncPath)

}
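Calling it is about as simple as it gets; the paths below are obviously placeholders:

#Upload a single file
Upload-SPFile -UncPath "\\fileserver\share\report.xlsx" -SPURL "https://baseurl/sites/team/Shared Documents"

#Or push a whole folder's worth
Get-ChildItem "\\fileserver\share" -File | ForEach-Object { Upload-SPFile -UncPath $_.FullName -SPURL "https://baseurl/sites/team/Shared Documents" }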

So those are the basics of uploading a file to SharePoint using the .NET WebClient. To many this is old hat; to me, it was something new to start out the new year.

posted @ Thursday, January 2, 2014 3:44 PM | Feedback (0) | Filed Under [ SharePoint SharePoint 2010 Powershell SharePoint 2013 ]

Wednesday, November 13, 2013

Using PowerShell to Check AD Schema

Here we are, a cold, crisp 20-degree Wednesday in November. I thought to myself…this is not cool (no pun intended), but you know what is cool? Yeah, I’m sure you guessed it: PowerShell’s ActiveDirectory module.

Just a quick blog note to show how PowerShell quickly settled a dispute during an upgrade of our AD schema to handle a Windows 2012 DC. Of course, this wasn’t a big dispute; many other tools could have been used. The question was whether the schema had already been extended to support a 2012 server. Again, there are many tools that could provide the answer, but what made this so cool was being able to share the experience with others who didn’t know PowerShell could replace some of the old standby AD tools. So this is more of an ah-ha moment that felt right to share (along with the script), all brought to us by PowerShell and the ActiveDirectory module.

(An academic honesty note here: this script is not 100% my own work, more like 5% to 10%. I can’t remember where I snagged the meat of this script, so the credit remains unknown.)

#This script will query AD for the schema version of AD, Exchange and Lync. Can be run as a least-privileged user.

Import-Module ActiveDirectory

#Array

$SchemaVersions = @()

#AD Portion

$SchemaHashAD = @{

13="Windows 2000 Server";

30="Windows Server 2003";

31="Windows Server 2003 R2";

44="Windows Server 2008";

47="Windows Server 2008 R2";

56="Windows Server 2012"

}

$SchemaPartition = (Get-ADRootDSE).NamingContexts | Where-Object {$_ -like "*Schema*"}

$SchemaVersionAD = (Get-ADObject $SchemaPartition -Property *).objectVersion

$AdSchema = New-Object System.Object

$AdSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionAD

$AdSchema | Add-Member -Type NoteProperty -Name Product -Value "AD"

$AdSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashAD.Item($SchemaVersionAD)

$SchemaVersions += $AdSchema

#Exchange Portion

$SchemaHashExchange = @{

4397="Exchange Server 2000 RTM";

4406="Exchange Server 2000 SP3";

6870="Exchange Server 2003 RTM";

6936="Exchange Server 2003 SP3";

10628="Exchange Server 2007 RTM";

10637="Exchange Server 2007 RTM";

11116="Exchange 2007 SP1";

14622="Exchange 2007 SP2 or Exchange 2010 RTM";

14726="Exchange 2010 SP1";

14732="Exchange 2010 SP2";

15137="Exchange 2013"

}

$SchemaPathExchange = "CN=ms-Exch-Schema-Version-Pt,$SchemaPartition"

If (Test-Path "AD:$SchemaPathExchange") {

$SchemaVersionExchange = (Get-ADObject $SchemaPathExchange -Property rangeUpper).rangeUpper

}

Else {

$ExchangeErr = 1

}

$ExchSchema = New-Object System.Object

$ExchSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionExchange

$ExchSchema | Add-Member -Type NoteProperty -Name Product -Value "Exchange"

$ExchSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashExchange.Item($SchemaVersionExchange)

If ($ExchSchema.Schema) {

$SchemaVersions += $ExchSchema

}

#Lync Portion

$SchemaHashLync = @{

1006="LCS 2005";

1007="OCS 2007 R1";

1008="OCS 2007 R2";

1100="Lync Server 2010";

1150="Lync Server 2013"

}

$SchemaPathLync = "CN=ms-RTC-SIP-SchemaVersion,$SchemaPartition"

If (Test-Path "AD:$SchemaPathLync") {

$SchemaVersionLync = (Get-ADObject $SchemaPathLync -Property rangeUpper).rangeUpper

}

Else {

$LyncErr = 1

}

$LyncSchema = New-Object System.Object

$LyncSchema | Add-Member -Type NoteProperty -Name Schema -Value $SchemaVersionLync

$LyncSchema | Add-Member -Type NoteProperty -Name Product -Value "Lync"

$LyncSchema | Add-Member -Type NoteProperty -Name Version -Value $SchemaHashLync.Item($SchemaVersionLync)

If ($LyncSchema.Schema){

$SchemaVersions += $LyncSchema

}

#OutPut Section

Write-Host "Known current schema version of products:"

$SchemaVersions | Format-Table * -AutoSize

#I think this error handling is probably better off in the setting of the note property but this takes care of it for now

If ($LyncErr -eq 1){

Write-Host "Lync or OCS not present" -ForegroundColor Yellow

}

If ($ExchangeErr -eq 1){

Write-Host "Exchange not present" -ForegroundColor Yellow

}

#---------------------------------------------------------------------------><>
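As a footnote, if all you care about is the AD schema version itself (no Exchange or Lync), the whole check boils down to a one-liner:

(Get-ADObject (Get-ADRootDSE).schemaNamingContext -Properties objectVersion).objectVersion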

So there you have it, another way PowerShell rocks.

posted @ Wednesday, November 13, 2013 10:22 AM | Feedback (0) | Filed Under [ ActiveDirectory Powershell ]

Tuesday, November 12, 2013

Hiding Disabled Users From Exchange Address Book

The other day, while reviewing an Exchange 2010 environment, I noticed a few active mailboxes belonging to disabled users. For obvious reasons this isn’t a good thing; if nothing else, it clutters up the Exchange Address Book.

Next thought in my mind: so what’s the best way to hide these disabled users? Having the PowerShell bias that I do in fact have, I had to spend 15 minutes reviewing the options.

  1. Use a manual process. This would include disabling the user in AD, followed up with the steps described here.
  2. Use Exchange Address Book Policies (ABPs). As indicated in this article, ABPs have a dependency on Exchange 2010 SP2. That said, it seems like a viable and interesting approach.
  3. Use PowerShell. As I stated from the outset, I’m biased right now…so a PowerShell-only approach seems “more better”.

Here is the script I used in a resource / user forest environment. Keep in mind this is a down-and-dirty version, a proof of concept; I would treat this example as inspiration only (good or bad).

#This script will query for all LinkedMailboxes when ran on an Exchange Server

#It will return a user set who show their Linked Master Accounts as disabled

#Use the results with "Set-Mailbox -HiddenFromAddressListsEnabled $true" to change

#all of the disabled users to hidden from the address book. Example Below

add-pssnapin Microsoft.Exchange.Management.PowerShell.E2010 -ErrorAction Continue

Import-Module ActiveDirectory

$linkmbx = get-mailbox -RecipientTypeDetails LinkedMailbox

$alcusers = Get-Aduser -Filter * -Server <your domain here> -Properties Enabled

$userrpt = @()

foreach ($mbx in $linkmbx){

$name = $mbx.linkedmasteraccount

$user = $name.split("\")

$alcuser = $alcusers | where {$_.samaccountname -eq $user[1]}

if ($alcuser.Enabled -eq $false){

$rpt = New-Object System.Object

$rpt | Add-Member -MemberType NoteProperty -Name Name -Value $alcuser.Name

$rpt | Add-Member -MemberType NoteProperty -Name Alias -Value $mbx.alias

$rpt | Add-Member -MemberType NoteProperty -Name HidFromAddBook -Value $mbx.HiddenFromAddressListsEnabled

$userrpt += $rpt

$rpt

}

}

Write-Host "There are" $userrpt.count "linked mailboxes with disabled user accounts in user domain"

<#

#Uncomment this section if you want to include changing the address book visibility

Foreach ($user in $userrpt){

Write-Host "Changing address book visibility for" $user.alias

Set-Mailbox -Identity $user.alias -HiddenFromAddressListsEnabled $true

}

#>
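Before flipping the switch on that commented section, it can be handy to keep a record of what was found; the report objects pipe straight out to CSV (the path is just an example):

#Optional: save the findings for review
$userrpt | Export-Csv -Path C:\Temp\DisabledLinkedMailboxes.csv -NoTypeInformation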

Of course the next thought of automation comes to mind…but that’s a different post. 

posted @ Tuesday, November 12, 2013 3:15 PM | Feedback (0) | Filed Under [ Exchange 2010 Powershell ]

Thursday, November 7, 2013

Create SharePoint 2013 Result Source with PowerShell

In my continued automation efforts, I was looking to convert documentation provided by a consultant into something more…well…automated. In this first of four parts I’ll walk through creating a Result Source with PowerShell.

Subsequent posts (parts 2 through 4) will give examples of creating Result Types, Query Rules, and Search Navigation. The aim of this effort is to make use of PowerShell in rebuilding, essentially cloning without data, a Search service application. This is useful when Microsoft support gives the classic solution of “rebuild” your service application. Doh!

It should be noted:

  • This will create the Result Source at the Site Collection level.
  • This isn't 100% my original work; it’s inspired by (taken mostly and modified from) the SearchGuys blog post.
    • The blog used Bing and federation as its example; this example is a local SharePoint result source to query BCS.

 

Add-PSSnapin Microsoft.SharePoint.PowerShell

#Change These Variables To Fit

$SPWeb = "Your SP Site Collection Here"

$resultSourceName = "Your Content Source Friendly Name Here"

$resultSourceDescription = "Description for (BCS) Data Source"

$qT = '{searchTerms?} (ContentSource="<Content Source Friendly Name Here>" IsContainer=false)'

#Begin The Process

$ssa = Get-SPEnterpriseSearchServiceApplication

$fedman = New-Object Microsoft.Office.Server.Search.Administration.Query.FederationManager($ssa)

$searchOwner = Get-SPEnterpriseSearchOwner -SPWeb $SPWeb -Level SPSite

$resultSource = $fedman.GetSourceByName($resultSourceName, $searchOwner)

#Check To See if it exists

if(!$resultSource){

Write-Host "Result source does not exist. Creating."

$resultSource = $fedman.CreateSource($searchOwner)

}

else { Write-Host "Using existing result source." }

#Finish It Up

$resultSource.Name =$resultSourceName

$resultSource.ProviderId = $fedman.ListProviders()['Local SharePoint Provider'].Id

$resultSource.Description = $resultSourceDescription

$resultSource.CreateQueryTransform($qT)

$resultSource.Commit()
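A quick way to confirm the source landed where expected is to ask the federation manager for it again; this just reads back what was created:

#Re-fetch and display the result source
$check = $fedman.GetSourceByName($resultSourceName, $searchOwner)
$check | Select-Object Name, Description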

posted @ Thursday, November 7, 2013 7:55 PM | Feedback (0) |

Working with SharePoint Web Parts using PowerShell

First things first: I’m not a developer. I seem to do OK working my way through the SharePoint object model with PowerShell and writing automation scripts, but that doesn’t make me a developer.

With that out of the way, I do find myself in an odd place, with developers neglecting (for whatever reason) to automate population of web parts in the content pages their solution has deployed. An example might be a Content Search web part needing to have the proper display template selected for displaying conversations. Ah, my dev friends (don’t hate me), but why not go that extra mile? If I can do it through script, surely you can employ your superior coding skills to include it in the solution (WSP)!

For those of you who may find yourself in my shoes, here is a script I created to help ease that cross-farm (environment) pain. Essentially the script is a function with a few parameters. This is the core; from here you can customize to your needs. It’s a great starting point for anyone who wants to automate changes to web parts via PowerShell. You can copy and paste the script below into PowerShell ISE and save it to whatever name you like.

 

Add-PSSnapin Microsoft.SharePoint.Powershell

#----Start of Function-------

Function Set-WebParts {

[CmdLetBinding()]

Param(

[Parameter(Mandatory=$True,Position=0)]

[String]$SiteUrl = $(Read-Host "Please Enter Site URL"),

[Parameter(Mandatory=$True,Position=1)]

[String]$PageUrl = $(Read-Host "Please Enter Page URL")

)

$web = Get-SPWeb $SiteUrl

#+Get and Checkout Page

#+-Get Page

$page = $web.GetFile($pageURL)

#+-CheckOut The Page

$page.CheckOut()

#+-Load Limited Web Part Manager

$wpm = $web.GetLimitedWebPartManager($pageURL, [System.Web.UI.WebControls.WebParts.PersonalizationScope]::Shared)

#+Change Conversations WebPart - In my example I had some webparts titled "Conversations" and some Titled "Content Search"

$wp = $wpm.WebParts | Where {$_.Title -eq "Conversations" -or $_.Title -eq "Content Search"}

#+-This is the base url for the template - Change this to whatever meets the need

$base = "~sitecollection/_catalogs/masterpage/Display Templates/Content Web Parts/"

#+-This is the template name

$template = "Item_Discussion.js"

$NewItemTemplateId = $base+$template

#+-Actually Setting The Part(s) - loop in case more than one web part matched

ForEach ($part in $wp) {

$part.ItemTemplateId = $NewItemTemplateId

$wpm.SaveChanges($part)

}

#+CheckIn and Publish Page

$page.CheckIn("Scripted Change")

$page.Publish("Scripted Publish")

$web.Dispose()

}

#----End Of Function----
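Calling the function then looks something like this (the URLs are placeholders; the page URL is relative to the web, matching what GetFile and GetLimitedWebPartManager expect):

#Example invocation
Set-WebParts -SiteUrl "http://sharepoint/sites/teamsite" -PageUrl "SitePages/Home.aspx"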


As I said, I’m not a developer, so while this is functional, there is probably a better way to accomplish what I’ve published here.

posted @ Thursday, November 7, 2013 5:31 PM | Feedback (0) | Filed Under [ SharePoint Rants Powershell SharePoint 2013 ]

SharePoint 2013 Managed Metadata Service Application (MMSA) Gremlins

 

This post’s objective: to simply document something I can’t explain.

First, the environment: a single WFE and two different app servers (APP1 and APP2). The SharePoint environment in question is running 2013 RTM bits (I know)…

The Timeline
  • Roughly 24 hours ago

An unplanned deployment of a custom farm solution. Mostly just an automated deployment of content pages with custom webparts. Nothing out of the ordinary here.

The standard post deploy testing revealed nothing out of the ordinary, other than Search and SSRS not playing well with each other in this production farm.

<rant>It seems the Microsoft support solution is to recreate the Search Service App, and then SSRS will play nice in the logs. I’ll tell you this works, but it’s not what I’d call acceptable…</rant>

  • 10 hours ago

Notification of Managed Metadata navigation malfunction by user

  • 6 hours ago

Start of troubleshooting the MMSA. The service application interface in CA had an error indicating “The Managed Metadata Service or Connection is currently not available. The Application Pool or Managed Metadata Web Service may not have been started. Please Contact your Administrator.” Naturally I figured there was a stopped application pool, which there was; it just wasn’t the one running this service.

Next I tried to open the service connection properties, only to get this error (from ULS).  Application error when access /_admin/ManageMetadataProxy.aspx, Error=Retrieving the COM class factory for component with CLSID {BDEADF26-C265-11D0-BCED-00A0C90AB50F} failed due to the following error: 800703fa Illegal operation attempted on a registry key that has been marked for deletion. (Exception from HRESULT: 0x800703FA). 

At this point I’m at a loss and figure I’ll try restarting the services via CA. Just for good measure I started the service on each server (WFE, APP1, APP2). Same results; nothing changed.

Read a blog suggesting the unlikely event of the application pool needing access to the service application. It worked well prior to this event, but for good measure, let’s add it in. Same results; nothing changed.

  • 3 hours ago

Resigned to throw the Hail Mary of an IISReset, just to see if it will commit anything changed to this point, I sent notification to the enterprise giving a heads-up of the unplanned reset.

  • 2 hours ago

Getting ready to go for lunch, I figured I’d take a quick look before throwing the switch on the IISReset. Before checking the MMSA, I ran Get-CacheClusterHealth only to get an error: “No valid cluster settings were provided with Use-CacheCluster.” Not a big deal; I anticipated this, so I ran Use-CacheCluster and then Get-CacheClusterHealth once more. This time I received the expected cluster health statistics. Getting somewhat anxious to make some headway, I figured I’d flip back over to the MMSA to make sure it was in fact still broken.
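For reference, that distributed cache sanity check was nothing more than these two cmdlets, run from an elevated SharePoint Management Shell on a cache host:

#Load the cluster configuration for this host, then report health
Use-CacheCluster
Get-CacheClusterHealth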

So yeah, as you might have guessed, it automagically started working.

  • 4 hours in the future

A cold beer or maybe…just maybe…a good shot of tequila.

Closing

In the end I can only blame the events on gremlins; someone clearly fed the SharePoint Mogwai after dark, and they had fun wreaking havoc. I can only send thanks to Rambo-Gizmo for eradicating the issue.

What I hate most about today’s events is the number of posts, such as this one by SharePointBabe, concluding that it’s just quicker to rebuild the service application. My issue is that this isn’t really an acceptable solution. I wouldn’t have such an issue if Microsoft support didn’t take the same approach…but then I’ve already had my rant for this post.

posted @ Thursday, November 7, 2013 3:09 PM | Feedback (0) | Filed Under [ SharePoint Rants SharePoint 2013 ]

Thursday, December 6, 2012

A Lesson About Notes Restricted Groups

This is an experience (rant) I share for the sake of those who may have something similar in their lives. As of this writing I am leading a Lotus Notes 8.5 to Exchange 2010 migration. Our requirements drove the usage of a co-existence solution offered by companies such as Binary Tree or Quest. Co-existence is defined as sharing a single mail namespace between Exchange and Lotus Notes.

In our case Binary Tree was chosen; however, their co-existence solution fell somewhat short when it came to preserving functionality for migrated users who need the ability to send mail to restricted mail groups in Lotus Notes. During the user migration process, the Notes person document is updated to facilitate mail routing to and from Exchange. That said, one would think a co-existence solution would take the necessary steps to change any permissions for any groups contextual to the migrated user. Apparently one would be very wrong, at least in the case of the Binary Tree solution.

As for the fix, it was as simple as adding the primary SMTP short name (e.g. first.last from first.last@domain.com) to the restricted group ACL within the Domino (Notes) address book. Once this change is complete, replicate the address book across your Domino environment. This seems to clear up those lovely Notes mail router responses of "Not authorized to send mail to this user or group" and, most importantly, restore user function across the co-existing environment.

posted @ Thursday, December 6, 2012 10:22 AM | Feedback (0) | Filed Under [ Rants Lotus Notes Migration Exchange 2010 Exchange ]
