Your basic ITPro blog... What's going on at work, what I'm interested in.

Monday, March 31, 2008

Virtualizing Using VMM

I have been looking into virtualizing XP machines using SCVMM. This tool has worked great for us on various W2003 servers. I wanted to try an XP box (namely, my laptop) for a couple of reasons. First, I wanted to see if it could be done. The docs say 'yes', but I wanted to test it. Second, I thought having a VM of my laptop standing by would be pretty nice in case my laptop HD dies. Third, I have Virtual PC running on my laptop and I wanted to see if it would run in a VM of my laptop... a VM hosting VMs.

While working through this process, I came up against various issues and errors that I had to resolve. I wanted to list them here, mainly so I could reference this list in the future when I forget this stuff!

The following services must be running on the source computer...

  • BITS
  • VMM Agent
  • WMI
  • WS-Management

Also, you need to make sure that nothing else is using port 443. It looks like BITS uses 443 to transfer data from the source machine to the VM host. I had IIS running on my laptop and it had 443 tied up. I just stopped my Default Website and everything was fine.
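Before starting a conversion, it is easy to check what (if anything) is holding port 443. This is just standard netstat plus Get-Process, nothing VMM-specific, and the PID shown is a made-up placeholder:

```powershell
# Find listeners on 443; the owning PID is in the last column
netstat -ano | Select-String ':443.*LISTENING'

# Map that PID back to a process name (1234 is a placeholder)
Get-Process -Id 1234
```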

The last requirement I ran into (and this may seem obvious) was that the VM Host I was deploying to had to have adequate resources available for the VM. This is true even if you don't plan on turning the VM on. Initially, my laptop would have put my Host over the top and my conversion failed. I pulled 2GB of RAM out of my laptop and ran things again, with no problems.

A Recommendation! Try to have Gigabit Ethernet all the way through for the data transfer. My laptop HD has around 70GB of data on it. Over a 100Mb link... well... this was going to take longer than I wanted to wait.

As I have been working through this process, I have been considering how we might use this technology to our advantage. It would be nice to have a VM backup of my machine. I have read articles by people who do this for their servers. They make a VM weekly and keep it as a stop-gap DR solution. If their physical box blows up, they can turn up the VM and run on it while they get the physical server fixed up.

For workstations, this would be much more beneficial if there were also a V2P (virtual to physical) conversion process. A Google search for 'V2P' brings up numerous hits, so I don't think I am the first to want this! My initial thought would be to turn up the VM, Ghost it, and then blow the Ghost image on to a physical box. I will see if my research reveals a better option.

2008-03-31 PowerShell Exercise

I just have to share this one. I know PowerShell isn't the be-all and end-all when it comes to Windows Management, but there are definitely times when it is the best tool. This is one of those times.

QUESTION:

Hey, Scripting Guy! How can I get a list of all the computers in my domain, grouping those computers by the OU in which the computer account resides?
-- TA

SCRIPTING GUYS' SOLUTION:

"Interestingly enough, this is a trickier problem than you might expect it to be. At first glance you might think, “Well, that should be easy enough: I just get back a list of all the computers and the OUs they belong to, then sort that list by OU.” That’s actually a pretty good idea, except for one thing: there is no Active Directory property that tells you which OU an object resides in. (No really; take a peek at this column for information on the roundabout method you need to use to determine the OU an object belongs to.)

"But that’s OK. The logical way to attack this problem is to retrieve a list of all the computers and then determine which OU they belong to. In this case, however, the logical way doesn’t do us any good. Therefore, we need to do things the illogical (i.e., the Scripting Guys) way; we’ll get back a list of all the OUs, and then determine which computer accounts can be found in each OU.

"In other words, we’re going to use the following script:

On Error Resume Next

Const ADS_SCOPE_SUBTREE = 2

Set objConnection = CreateObject("ADODB.Connection")
Set objCommand = CreateObject("ADODB.Command")
objConnection.Provider = "ADsDSOObject"
objConnection.Open "Active Directory Provider"
Set objCommand.ActiveConnection = objConnection

objCommand.Properties("Page Size") = 1000
objCommand.Properties("Searchscope") = ADS_SCOPE_SUBTREE

objCommand.CommandText = _
    "SELECT ADsPath FROM 'LDAP://dc=fabrikam,dc=com' WHERE " & _
        "objectCategory='organizationalUnit'"
Set objRecordSet = objCommand.Execute

objRecordSet.MoveFirst

Do Until objRecordSet.EOF
    Set objOU = GetObject(objRecordSet.Fields("ADsPath").Value)
    Wscript.Echo objOU.distinguishedName

    objOU.Filter = Array("Computer")
    For Each objItem in objOU
        Wscript.Echo "    " & objItem.CN
    Next

    Wscript.Echo
    Wscript.Echo

    objRecordSet.MoveNext
Loop

POWERSHELL SOLUTION:

Get-QADComputer | Sort-Object -Property ParentContainer | Select-Object Name, ParentContainer
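The one-liner sorts by container; if you want the output actually grouped under each OU (still assuming the Quest cmdlets are loaded), Group-Object is a natural fit:

```powershell
# Group computer accounts by their parent container (OU)
Get-QADComputer | Group-Object -Property ParentContainer | ForEach-Object {
    $_.Name                                          # the OU path
    $_.Group | ForEach-Object { "    $($_.Name)" }   # its computers, indented
}
```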

Sunday, March 30, 2008

Biting the Bullet...

I admit it... I have been a hold-out when it comes to Microsoft's new software. This is true of Office 2007 and, especially, Windows Vista. Unfortunately, they are both coming, whether I like it or not.

So, as a small sign of flexibility (and resignation to the inevitable), I installed Office 2007 this evening. Not only that, but I used the 'Upgrade' option. I was going to install it side by side with Office 2003, but I didn't want to chew up the bits on my HD. I am running low as it is.

I have been resistant to Office 2007 and Vista from Day 1. The main reason for this is that I was not able to come up with a good answer to the following question:

"What does the current version of this software NOT do that I really need done?"

This question is even more compelling when considering the XP/Vista question. XP is a good OS... mature, stable, compatible, well-known. Vista is, from what I have seen and read, none of these things (though I have heard positive things about SP1). I plan to stay on XP as long as possible, unless my boss decides to get me a new laptop and it comes with Vista. ;)

Office 2003 also does everything I need, and much more. In fact, as I was considering things, I came to the realization that I only really use Outlook and a bit of Visio. It is very rare that I will pop open Excel, Word, Access, or Publisher. On occasion I will use these apps to view a document. But it is a truly rare event for me to open these apps to a new document and create something of my own. I just don't really have the need to do this very often.

But, the world around me is changing... and, as the saying goes, "Lead, follow, or get out of the way!" Well, getting out of the way is not an option (I've tried!). It's too late for me to lead; there are already many others at work with Office07, and more coming. So, my only real option was to follow... If I wanted to be compatible with, and knowledgeable of, the computing platform that is rapidly becoming the standard on our campus, I had to upgrade.

As you may guess, I have only opened Outlook so far. Looks nice.

Wednesday, March 26, 2008

User Account Management

We had a very practical problem that we needed to solve today.

In the past, we had used the properties in the 'Profile' tab of AD user accounts to manage login scripts and home folders. We are now, however, using group policies to manage this stuff. We have been 'cleaning' our user accounts over time, removing these settings. But, the task was not yet done. And, we didn't know which accounts had been processed and which hadn't.

This morning, we decided to tackle this problem once and for all. At first, we were just going to sit down and go through our accounts, one by one, and delete the logon scripts and home folders settings. This, as you can imagine, would have taken a while. So, we thought, let's see if PowerShell can help us out... Of course it can!

The first thing I did was go into my test environment and create these settings in a user account. Then, using the Quest cmdlets, I checked to see if these properties were available. The problem was, these properties are not exposed by default. That is, if I assign the output of Get-QADUser to the variable $user and then type '$user.', tab completion does not show these properties.

And here is where I learned something today...

Typing 'Get-QADUser -IncludeAllProperties -ReturnPropertyNamesOnly' gives you a list of all the properties available, not just the default properties that the cmdlet wants you to see. In this expanded list you see, among many others, "scriptPath" and "homeDirectory".
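Since that list is long, it can be piped through a filter rather than eyeballed; a small sketch (assuming the Quest cmdlets are loaded):

```powershell
# Narrow the full property-name list down to the attributes of interest
Get-QADUser -IncludeAllProperties -ReturnPropertyNamesOnly |
    Where-Object { $_ -like "*script*" -or $_ -like "*home*" }
```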

With this information, finding and changing these property settings became as easy as a couple of one-liners... specifically:

Get-QADUser -ObjectAttributes @{scriptPath="netuse.bat"} | ForEach-Object {Set-QADUser $_.DN -ObjectAttributes @{scriptPath=''}}

and

Get-QADUser -ObjectAttributes @{homeDirectory="*"} | ForEach-Object {Set-QADUser $_.DN -ObjectAttributes @{homeDirectory=''}}

There were 30-40 accounts with logon scripts defined and around 25 accounts with home directories. These were spread among over 350 user accounts total! I think that researching and finding this scripted way of doing this was a good investment of time and energy.
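For scoping a job like this before changing anything, counting the matching accounts is a one-liner (same Quest cmdlets as above):

```powershell
# How many accounts still have a logon script or home directory set?
(Get-QADUser -ObjectAttributes @{scriptPath="*"} | Measure-Object).Count
(Get-QADUser -ObjectAttributes @{homeDirectory="*"} | Measure-Object).Count
```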

Sunday, March 23, 2008

PowerShell Posts...

I am not going to be able to keep these as a daily post. This will likely be to the relief of some... It certainly will be to me. Some days it only takes a minute; some days my solution is little more than a transliteration of the VB version. If I run across a particularly cool example of how PowerShell makes a script much easier, like this, I will post it.

Microsoft is committed to PowerShell as a technological base for their products. As a Microsoft ITPro, I want/need to learn as much as I can about it. But, I am going to shift the focus of this blog, just a bit, to be more about work-specific items.

Friday, March 21, 2008

Virtualizing a File Server

We are in the middle of a project to virtualize our primary file server. As you can imagine, this is a large project for us. We had a lot to consider and plan out before moving forward. All in all, it has been an eye-opening endeavor.

One of the biggest challenges was deciding on how the new implementation would look. We knew we wanted to use a SAN for storage, rather than the DAS we have now. After considering multiple solutions, we went with a Dell MD3000i. This iSCSI SAN solution is a pretty solid platform, though not perfect (what technology is?!) Along with the MD3000i, we also got a MD1000. We are using a combination of SAS and SATA disks, depending on our intended use of the storage.

Our next decision was regarding the server itself. Did we want to use a physical box, or a virtual machine? We decided on a VM. I liked the idea of easy portability and hardware independence. We currently use Microsoft Virtual Server and will be moving to Hyper-V later this year. Our VMs will move to the new platform with no problems. And, because we are storing all of the VM files on a SAN, moving VMs is as easy as re-mapping LUNs to new physical boxes.

So, we had our decisions on storage and server. As I got into using the MD3000i, I found that things weren't going to be quite as smooth as I was hoping (ignorance really is bliss!). For one thing, LUNs larger than 2TB are not possible, which means that VHDs larger than 2TB are not really possible (without some hacking). Also, with our platform, there isn't really a way to get a VM to directly use disk storage. We have to use VHDs. But, I am getting ahead of myself...

We spent considerable time on how best to provide multiple TBs of storage from the SAN to our VM file server. The items below lay out the details:

CURRENT ENVIRONMENT:

One physical server with 1TB of DAS. Most users have a personal folder mapped to a drive letter and a shared folder mapped to a drive letter. The security in the shared folder is not exactly coherent or structured. It is pretty much a free-for-all. Users can create folders anywhere and most users can access more information than they need. ACLs are also a bit unwieldy, using (mostly) user accounts rather than groups. Finally, we were out of space; our 1TB was full.

NEW ENVIRONMENT:

One virtual server with 2TB disk for shared storage. I created a 2TB LUN on the SAN and mapped it to the VM host server. I then created a 2TB VHD file on this disk and attached that VHD to the file server VM. So, the file server VM has two disks; a boot disk (on a separate LUN) and a data disk. When we need to add storage to the server, it is as simple as creating another LUN and another VHD. Users (now all users) are still getting a personal storage drive and a shared storage drive.

RESULTS:

The biggest difference (from the users' perspective) is the shared storage. My related entries here and here detail what we are doing. I have moved a portion of our shared data to the new server and users have really liked the new solution. They most appreciate how 'clean' the shared drive is now (thank you, ABE!) Also, departments such as HR and Accounting were a bit startled by the fact that access to their files wasn't quite as limited as they thought. They like the fact that now people can't even see, let alone access, their files. So, this new implementation has been a big win as far as users are concerned.

On the management side of things, life will be much simpler as well. Each top-level folder will only have four ACL entries. We will be able to know which folders a user has access to simply by looking at group membership. New folder creation is a snap with the PowerShell script I wrote. Things are organized, clean, structured, and (most importantly) known. The system is now largely self-documenting.

We currently use BackupExec for backups and have put an agent on the VM. So we back up things just as you would with a physical box. But, we are going to be implementing System Center Data Protection Manager later this year. This should give us a more robust D2D2T backup solution for our VM environment.

2008-03-21 PowerShell Exercise

The first thing you will probably notice is that I did not use regular expressions. There were a couple of reasons for this. One, it just seemed easier to trim the 'license' string, cutting off the ")," which is present in every record. This seemed easier to me than using regular expressions because, as it turns out, I don't really know how to use them... something I need to learn.

The first few lines of my script actually build the text file used as input. I could have done this in Notepad or whatever, but thought I would do it in the script itself...

I hope you enjoy these, or at least find them a bit interesting.

QUESTION:

Hey, Scripting Guy! I'm trying to automate a tedious license server task. To do this, I need to loop through a text file line-by-line, grab out a user name, asset number, and license ID, and then pass those values to a batch file. How can I do all that?

Sample Data:
Username1 asset1 asset1 (v24) (licserver/1000 1111), start Wed 3/19 8:50
Username2 asset2 asset2 (v24) (licserver/1000 1112), start Wed 3/19 8:55
Username3 asset3 asset3 (v24) (licserver/1000 1113), start Wed 3/19 8:59

SCRIPTING GUYS SOLUTION:

Const ForReading = 1

Set objShell = CreateObject("Wscript.Shell")

Set objRegEx = CreateObject("VBScript.RegExp")
objRegEx.Pattern = "\d{1,}"

Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objFile = objFSO.OpenTextFile("C:\Scripts\Test.txt", ForReading)

Do Until objFile.AtEndOfStream
    strLine = objFile.ReadLine
    arrItems = Split(strLine, " ")
    strUserName = arrItems(0)
    strAsset = arrItems(1)
    strLicenseID = arrItems(5)
    Set colMatches = objRegEx.Execute(strLicenseID)
    strLicense = colMatches(0).Value
    strCommand = "%comspec% /c C:\Scripts\Test.cmd " & strUserName & " " & strAsset & " " & strLicense
    objShell.Run strCommand, 1, True
Loop

objFile.Close

MY SOLUTION:

# Create test data file
New-Item -Name "test.txt" -ItemType "File" -Force -Value `
"Username1 asset1 asset1 (v24) (licserver/1000 1111), start Wed 3/19 8:50
Username2 asset2 asset2 (v24) (licserver/1000 1112), start Wed 3/19 8:55
Username3 asset3 asset3 (v24) (licserver/1000 1113), start Wed 3/19 8:59"

# Process the records
$records = Get-Content "test.txt"
foreach ($record in $records) {
    $arrElements = $record.Split(" ")
    $userName = $arrElements[0]
    $asset = $arrElements[1]
    # Trim the trailing ")," from the license element
    $license = $arrElements[5].Substring(0, $arrElements[5].Length - 2)
    # Invoke-Item does not pass arguments; use the call operator instead
    & ".\test.cmd" $userName $asset $license
}
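As an aside, the trim could also be done with the regular expressions I admitted I still need to learn; a small sketch using -replace:

```powershell
# Sample element as produced by the split above
$element = "1111),"

# Strip the trailing ")," -- same result as the Substring trim
$license = $element -replace '\),$', ''
$license   # -> 1111
```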

Thursday, March 20, 2008

2008-03-20 PowerShell Exercise

So, this one is a cool example of how PowerShell integrates and uses WMI. Now, I just need to learn WMI!  :-)

QUESTION:

Hey, Scripting Guy! I’ve looked through the entire Script Center but I can’t find an example of what I need to do: find all the .DOC files on drives C and D, but using a single WMI query. In other words, I want to write a WMI query similar to this: Where ( Drive = 'C:' or 'D:' ) and ( Extension = 'doc' ); unfortunately, though, I can’t figure out the correct syntax. Is there a way to mix AND and OR clauses in a single query?

SCRIPTING GUYS SOLUTION:

strComputer = "."
Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")
Set colFiles = objWMIService.ExecQuery _
    ("Select * From CIM_DataFile Where (Drive = 'C:' OR Drive = 'D:') AND Extension = 'doc'") 
For Each objFile In colFiles 
    Wscript.Echo objFile.Name 
Next

MY POWERSHELL SOLUTION:

$computerName = "." 
$wmiQuery = "Select * from CIM_DataFile Where (Drive = 'G:' OR Drive = 'U:') AND Extension = 'doc'" 
Get-WmiObject -Query $wmiQuery -ComputerName $computerName

Or, as a one-liner...

Get-WmiObject -Query "Select * from CIM_DataFile Where (Drive = 'G:' OR Drive = 'U:') AND Extension = 'doc'" -ComputerName "."
# You can leave off the -ComputerName parameter altogether if you want

Wednesday, March 19, 2008

Formatting Suggestions

Does anyone have any suggestions on how I can easily format code on this blog? I am guessing I can add a style or something, but I am not sure how/what to do in BlogSpot... It is a royal pain publishing code here!

2008-03-19 PowerShell Exercise

Today's script is a very useful one. It is always frustrating to see groups as members of groups without being able to easily see the membership of those nested groups.

THE QUESTION:

Hey, Scripting Guy! I manage over 300 servers in our environment. For each server I need to determine the members of the local Administrators account. It’s easy to get a list of local users and domain users that belong to the Admin account; it’s also easy to get a list of any domain groups that belong to the Administrators account. However, what I’d really like to do is take each of those domain groups and then get a list of their members. In other words, I don’t want to know just that the Finance Managers group has local Administrator rights; I’d like to know who belongs to the Finance Managers group (and thus has local Admin rights). Can you help?

SCRIPTING GUYS' SOLUTION:

strComputer = "atl-fs-001"
strTestString = "/" & strComputer & "/"

Set objGroup = GetObject("WinNT://" & strComputer & "/Administrators")

For Each objMember In objGroup.Members
    If objMember.Class = "Group" Then
        If Not InStr(objMember.AdsPath, strTestString) Then
            Set objDomainGroup = GetObject(objMember.AdsPath)
            Wscript.Echo objDomainGroup.Name
            For Each objDomainMember in objDomainGroup.Members
                Wscript.Echo objDomainMember.FullName & " (" & objDomainMember.Name & ")"
            Next
            Wscript.Echo
        End If
    End If
Next

MY SOLUTION:

cls
$ErrorActionPreference = "SilentlyContinue"

$adminsGroup = [ADSI]"WinNT://derekm-vpc01/Administrators,group"
$groupMembers = $adminsGroup.PSBase.Invoke("Members") | %{$_.GetType().InvokeMember("Name",'GetProperty', $null, $_, $null)}
foreach ($groupMember in $groupMembers) {
    $gm = $gmMembers = $null
    $gm = Get-QADGroup $groupMember
    if ($gm -ne $null) {
        $gmMembers = Get-QADGroupMember $gm
        $gm.Name
        Write-Host "---------------"
        $gmMembers
    }
}

This one took some research. I got the $groupMembers code from here. I was not really able to find a good 'pure PowerShell' solution to this problem. This script is no cleaner, shorter, or 'PowerShell-er' than the VB solution. This script is quick, brute-force, dismissive of errors... in a word... UGLY! But, I am pretty sure it produces the same output.

Tuesday, March 18, 2008

2008-03-18 PowerShell Exercise

I almost didn't bother with this one... There is really no difference between The Scripting Guys' solution and the PowerShell solution. In their article, they talk about how the MS Office Suite stores, displays, and edits this information. Because each Office App does these three things differently (who writes this stuff?!), we have to use Word to retrieve this information, rather than PowerPoint. Gotta love Microsoft!

THE QUESTION:

In PowerPoint user information is stored under the Tools menu. I would like to know if there is a way to extract this data by using a script.

THE SCRIPTING GUYS ANSWER:

Set objWord = CreateObject("Word.Application")
Wscript.Echo "Name: " & objWord.UserName
Wscript.Echo "Initials: " & objWord.UserInitials
objWord.Quit

MY POWERSHELL ANSWER:

$word = New-Object -ComObject Word.Application
$word.UserName
$word.UserInitials
$word.Quit()   # close the hidden Word instance rather than leaving it running
$word = $null

Monday, March 17, 2008

Expanding Our VM Environment

We are adding a third VM Host this week. We are currently using Microsoft Virtual Server 2005 and plan on migrating to Hyper-V later this year. Our Hosts are humble, but functional: Dell PE 860s with 8GB RAM and six NICs total. We use one NIC for LAN access, two for SAN access, and the rest for VM access to our network.

We are also working to implement Virtual Machine Manager, but things aren't as smooth as I would like. I think many of the headaches are due to my lack of knowledge. I am working to remedy that. We have also been running the evaluation version of this... until now. We purchased VMM and I am just installing the licensed version now. The biggest feature I am interested in (which I have not been able to get working yet) is CheckPoints. I am hoping that something will change with the full version.

We will see...

2008-03-17 PowerShell Exercise

WOW! Do I love PowerShell?! You bet I do. Today's Scripting Guys question is a perfect example of why PowerShell is so powerful and so great.

THE QUESTION:
I’ve been asked to come up with a list of all the groups that have just 1 member or fewer. How can I write a script that will return this information for me?

THE SCRIPTING GUYS ANSWER:

On Error Resume Next

Const ADS_SCOPE_SUBTREE = 2

Set objConnection = CreateObject("ADODB.Connection")
Set objCommand = CreateObject("ADODB.Command")
objConnection.Provider = "ADsDSOObject"
objConnection.Open "Active Directory Provider"
Set objCommand.ActiveConnection = objConnection

objCommand.Properties("Page Size") = 1000
objCommand.Properties("Searchscope") = ADS_SCOPE_SUBTREE

objCommand.CommandText = _
    "SELECT ADsPath, Name FROM 'LDAP://dc=fabrikam,dc=com' WHERE objectCategory='group'"

Set objRecordSet = objCommand.Execute

objRecordSet.MoveFirst

Do Until objRecordSet.EOF
    Set objGroup = GetObject(objRecordSet.Fields("ADsPath").Value)

    i = 0

    For Each strUser in objGroup.Member
        i = i + 1
        If i > 1 Then
            Exit For
        End If
    Next

    If i <= 1 Then
    Wscript.Echo objRecordSet.Fields("Name").Value & " -- " & i
    End If

    objRecordSet.MoveNext
Loop

MY POWERSHELL ANSWER:

foreach ($group in Get-QADGroup) {if ($group.Members.Count -le 1) {Write-Host $group.Name "--" $group.Members.Count}}

This script should be entered on one line. Pretty amazing, huh? In this one-liner, I am using the Quest AD cmdlets, which are fantastic. They make AD work in PowerShell so much easier.

See you again tomorrow.

Sunday, March 16, 2008

Reset Security on Folder Structure with PowerShell

I have been thinking through our new shared folder storage and came across the following dilemma.

As you probably know, when you copy an item from one location to another (same volume or different volume) the item inherits its destination's security settings. However, you have to be more careful when you move an item.

If you move an item to a destination on a different volume, there are actually two steps that take place: a copy and then a delete. The item is copied to the new location (thereby inheriting the security settings of its destination) and then it is deleted from its original location.

But, if you move an item to another location on the same volume, only the pointer to the item is changed. So, the item retains its security settings. This is a HUGE problem for me, one that I was starting to get worried about.
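Inspecting which entries on an item are explicit versus inherited is easy with Get-Acl; a small sketch (the path here is a hypothetical example):

```powershell
# Show only the explicit (non-inherited) ACEs on an item
# 'FolderB\MovedFile.txt' is a made-up path; substitute your own
(Get-Acl 'C:\Shares\Groups\FolderB\MovedFile.txt').Access |
    Where-Object { -not $_.IsInherited }
```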

Our User and Group shares are stored on the same volume on our server. It looks something like this:

In this structure, the 'Share-Admin' security group has Full Control rights to the 'Shares' folder. The 'Groups' folder is shared out and every user gets a drive mapping to that folder. Each folder under 'Groups' gets its own security groups assigned to it, one for MODify access and one for ReadOnly access (see my entry here).

The 'Users' folder is shared out and every user gets a drive mapping to: \\server\Users\<username>. Each user is given Modify rights to their folder.

These resources all use Access Based Enumeration so each user only sees the folders that they have access to.

It all seemed to be working out quite well, until I realized that moves were possible. If a user has access to multiple folders within 'Groups', they could move items from one folder to another. And, since the source and destination would be on the same volume, the item(s) being moved would take their security settings with them. Not good!

So, what is my solution to this?! I am formulating a solution that I hope will be workable. Of course, I am using PowerShell. I am developing a script that will iterate through all of the child objects within my base folders and reset the security settings so that no misplaced or explicit entries remain.

My first go-around was pretty simple.

$ErrorActionPreference = "SilentlyContinue"

$path = 'c:\shares\users\derekm'

$dirs = Get-ChildItem -Path $path -Recurse

foreach ($item in $dirs) {
    $acl = Get-Acl $item.FullName
    $aces = $acl.Access
    foreach ($ace in $aces) {
        if ($ace.IsInherited -eq $FALSE) {
            $removeACE = $acl.RemoveAccessRule($ace)
        }
    }
    $ACLset = Set-Acl $item.FullName $acl
}

$path = $dirs = $acl = $aces = $removeACE = $ACLset = $null

All this does is:

  • Load all child items for a particular folder into an array.
  • For each item
    • Get the ACL
    • Load the ACEs into an array
    • Remove all explicit ACEs
    • Write the modified ACL back to the item

This was fine, except for one thing... If you move an item from one folder to another within the 'Groups' folder structure, the ACEs all show up as inherited. So, if you move FILE.TXT from FOLDERA to FOLDERB, FILE.TXT will have inherited FOLDERA security, even though it now resides in FOLDERB. My script, as listed above, cannot fix this. I need another way. With some minor tweaks, I have come up with:

$ErrorActionPreference = "SilentlyContinue"

$path = 'c:\shares\Groups\Folder0013'

$baseACL = Get-Acl $path

$items = Get-ChildItem -Path $path -Recurse

foreach ($item in $items) {
    Set-Acl $item.FullName $baseACL
    $itemACL = Get-Acl $item.FullName
    $itemACEs = $itemACL.Access
    foreach ($itemACE in $itemACEs) {
        if ($itemACE.IsInherited -eq $FALSE) {
            $null = $itemACL.RemoveAccessRule($itemACE)
        }
    }
    Set-Acl $item.FullName $itemACL
}

$path = $baseACL = $items = $item = $itemACL = $itemACEs = $itemACE = $ACLset = $null

The main change here is that I first overwrite the current ACL on each item with that of its proper parent folder. I then go through and remove all explicit entries, leaving only inherited security settings. Great!

So, this seems to be the raw mechanics that will work for me. Now, I just need to augment this script, allowing it to accept input, handle possible fatal errors, etc.
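As a first step toward that augmentation, here is a hedged sketch of how the script might take its path as input ($Path and the prompt wording are my own invention, not part of the script above):

```powershell
param(
    [string]$Path = $(Read-Host "Enter the base folder path")
)

# Bail out early on a bad path; real error handling still to come
if (-not (Test-Path $Path)) {
    Write-Host -ForegroundColor Red "'$Path' does not exist... exiting."
    return
}

$baseACL = Get-Acl $Path
# ... the reset logic from the script above would follow here ...
```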

Saturday, March 15, 2008

Introduction

Allow me to introduce myself (I realize I have not done that yet)...

My name is Derek Mangrum and I work at Central Christian Church of the East Valley in Mesa, AZ. I am the Network and Systems Administrator. Actually, I don't know what my specific title is...

But, I take care of the servers, the network, the Cisco IP phone system on our Gilbert campus, and anything else that comes along.

I am a MCSE (Windows 2003), a CCNA, and am BackupExec 10.x certified. I have been in the computer biz for over 16 years.

My e-mail address is: derekDOTmangrumAT-SYMBOLcccevDOTcom

My first computer was a Commodore 64. My dad bought one, used, from his boss when I was just a kid. He figured that computers were 'up and coming' and wanted to expose his kids to them. Little did he know that he was setting me on a life's course (as far as work goes).

Well, that's about it for now.

Friday, March 14, 2008

Tear-Down Shared Folders

Here is the script to tear down the shared folders/security groups we are using -

###############################################################################################
##
##   DeleteGroup.ps1
##
## Deletes File Shares and associated Security Groups:
##         - Deletes MOD, RO, groups for security assignments to group shares
##         - Deletes the shared folders themselves
##
## If using a TXT file for input, it contains a list of folder names.
##
## TXT File Format:
##   One Column
##       - First row = NAME (used as data element identifier)
##       - Subsequent rows = <name of the folder to be created>  **13 characters or less
##            - Longer names will be truncated to 13 characters!!
## TXT file example:
##    NAME
##    Folder001
##    Folder002
##
## Best if run on GC of Domain
##
###############################################################################################

function get_option {
    # This function allows user to select input type
    #   - single folder entered at command line or .TXT file
    Clear-Host
    Write-Host -ForegroundColor Cyan "WELCOME TO THE 'DELETE SHARED FOLDER(S) SCRIPT'"
    Write-Host "Do you want to enter a single folder name or use a .TXT file for input?"
    Write-Host "  Please enter '1' to type a single folder name"
    Write-Host "  Please enter '2' to use a .TXT input file"
    $script:chosen = Read-Host "1 or 2"
    switch ($script:chosen) {
        1 {
            Write-Host "You selected to enter a single folder name."
            $script:foldername = Read-Host "Enter the folder name"
            Write-Host "You entered '$script:foldername'"
            $script:isBatch = $FALSE
            $script:MayProceed = $TRUE
        }
        2 {
            Write-Host "You selected to enter a .TXT file name for input."
            $script:ListOfFolders = Read-Host "Enter the .TXT filename"
            Write-Host "You entered '$script:ListOfFolders'"
            $script:MayProceed = Test-Path $script:ListOfFolders
            if ($script:MayProceed) {
                Write-Host "File exists... Proceeding"
                $script:isBatch = $TRUE
            }
            else {
                Write-Host -ForegroundColor Red "File does not exist... Try again"
                Start-Sleep -Seconds 2
            }
        }
        default {
            Write-Host -ForegroundColor Red "You did not choose a valid option... Please try again..."
            Start-Sleep -Seconds 3
        }
    }
}

function process_folder {
    param([string]$foldername)
    # Do we actually have a folder name?
    if ($foldername.length -ne 0)
    {
        Write-Host "************************************************"
        Write-Host "Now processing folder name:" $foldername "...."

        # Defining path to folder
        $FullPath = "\\W2003\Groups\" + $foldername

        # Does this folder exist?
        $isThere = Test-Path $FullPath
        if ($isThere)
        {
            # Delete Folder
            Write-Host "Deleting the folder..." -NoNewline
            Remove-Item -path $FullPath -Recurse
            Write-Host "  DONE!"

            # Create variables holding group names
            # If necessary, shorten folder name to 13 characters
            if ($foldername.length  -gt 13) {
                $MODgroup = "s_" + $foldername.substring(0,13) + "_MOD"
                $ROgroup  = "s_" + $foldername.substring(0,13) + "_RO"
            }
            else {
                $MODgroup = "s_" + $foldername + "_MOD"
                $ROgroup  = "s_" + $foldername + "_RO"
            }
            # Deletes Groups
            Write-Host "Deleting the MODify security group..." -NoNewline
            $deletedGrp = Remove-QADObject -Identity $MODgroup -DeleteTree -Force
            Write-Host "  DONE!"
            Write-Host "Deleting the ReadOnly security group..." -NoNewline
            $deletedGrp = Remove-QADObject -Identity $ROgroup -DeleteTree -Force
            Write-Host "  DONE!"
        }
        else
        {
            Write-Host -ForegroundColor Red $foldername "folder does not exist."
        }
    }
}

$chosen = "0"
$ListOfFolders = ""
$foldername = ""
$isBatch = $FALSE
$MayProceed = $FALSE

do {get_option} until (($chosen -eq "1" -or $chosen -eq "2") -and $MayProceed -eq $TRUE)

if ($isBatch) {
    $ListOfFolderNames = Import-CSV $ListOfFolders
    foreach ($folder in $ListOfFolderNames) {
        $foldername = $folder.NAME
        process_folder $foldername
    }
}
else {
    process_folder $foldername
}

Write-Host "All Finished. Thank you."
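A side note on the do/until loop near the bottom: PowerShell's -and operator binds more tightly than -or, so mixed conditions like that one are worth parenthesizing explicitly. A quick illustration:

```powershell
# -and has higher precedence than -or, so these two parse differently:
$implicit = $true -or $true -and $false    # parsed as $true -or ($true -and $false) -> True
$explicit = ($true -or $true) -and $false  # -> False
```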

New Shared Folders

We are working on implementing a new file server at work. That means new user folders and new group folders. Our current 'shared' storage is a bit of a mess, security-wise.

In creating our new file server, we came up with the following security model.

For USER shares: 'Share Admins' have full control; each user has modify rights to his/her own folder.

For GROUP shares: 'Share Admins' have full control; each group folder has two security groups in its ACL, one with MODify rights and one with ReadOnly rights.

Users get a drive mapped to their USER share and a drive mapped to the 'Groups' folder (which contains all of the individual folders described above).

We are using Access Based Enumeration on this server, so users only see the folders they have access to.
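For reference, on Windows Server 2003 ABE is not built in; it comes as a separate download from Microsoft, and that package includes a command-line utility for enabling it per share. Something along these lines (check the tool's own help for the exact syntax on your build):

```powershell
# Enable Access Based Enumeration on the Groups share
# (abecmd.exe ships with Microsoft's ABE download for Windows Server 2003)
abecmd.exe /enable Groups
```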

I quickly realized that migrating all of our shared data to the new file structure was going to be a huge chore. Specifically, I was not looking forward to having to:

- Create the shared folder
- Create the two security groups in AD
- Modify the ACL on the folder, assigning modify and readonly rights as appropriate.

This doesn't sound like a big deal, until you think about having to do this for dozens of folders! So, I wrote a PowerShell script to do this for me. This script is by no means perfect. I am sure that there are scripters out there who will cringe. But, it gets the job done. I am tweaking it and welcome any suggestions.
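For example, the group-naming convention the script uses (an 's_' prefix, the folder name truncated to 13 characters, and a '_MOD' or '_RO' suffix) boils down to this bit of logic (a standalone sketch; the script below inlines it):

```powershell
function Get-ShareGroupName {
    # Builds a security group name from a folder name and a rights suffix.
    # Long folder names are truncated to 13 characters to keep the
    # sAMAccountName within a safe length.
    param([string]$FolderName, [string]$Suffix)
    if ($FolderName.Length -gt 13) {
        $base = $FolderName.Substring(0, 13)
    }
    else {
        $base = $FolderName
    }
    "s_" + $base + "_" + $Suffix
}

Get-ShareGroupName "Folder001" "MOD"           # -> s_Folder001_MOD
Get-ShareGroupName "AVeryLongFolderName" "RO"  # -> s_AVeryLongFold_RO
```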

Some points:

  • The 30-Second Pause: I have found that, without the pause, the server does not always see the new security groups in AD when the ACL is applied. By pausing for a bit, the AD changes have time to propagate/refresh/whatever. I am still looking for a better solution.
  • I also have a 'tear-down' script that deletes the security groups and folders. I will publish this later.
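On that 30-second pause: one alternative I am considering is polling instead of sleeping, i.e. retry until the new group's name actually resolves to a SID, then proceed. A sketch of the idea (Wait-ForCondition and the 'MYDOM' domain name are my own placeholders, not part of the script below):

```powershell
function Wait-ForCondition {
    # Re-run $Test every $IntervalSec seconds until it returns $true
    # or $TimeoutSec elapses; returns whether the condition was met.
    param([scriptblock]$Test, [int]$TimeoutSec = 120, [int]$IntervalSec = 5)
    $deadline = (Get-Date).AddSeconds($TimeoutSec)
    while ((Get-Date) -lt $deadline) {
        if (& $Test) { return $true }
        Start-Sleep -Seconds $IntervalSec
    }
    return $false
}

# Usage idea: wait until the MOD group resolves before touching the ACL.
# Translate() throws until the DC being queried knows about the group.
# $ok = Wait-ForCondition -Test {
#     try {
#         $acct = New-Object Security.Principal.NTAccount("MYDOM\$MODgroup")
#         $null = $acct.Translate([Security.Principal.SecurityIdentifier])
#         $true
#     } catch { $false }
# }
```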

Here is the script:

###############################################################################################
##
##   MakeGroup.ps1
##
## Creates File Shares and associated Security Groups:
##         - Creates MOD and RO groups for security assignments to group shares
##         - Creates the shared folders themselves
##
## If using a TXT file for input, it contains a list of folder names.
##
## TXT File Format:
##   One Column
##       - First row = NAME (used as data element identifier)
##       - Subsequent rows = <name of the folder to be created>
##            - Longer names will be truncated to 13 characters for Security Group creation!!
## TXT file example:
##    NAME
##    Folder001
##    Folder 002
##    Folder for Group 3
##
## Best if run on GC of Domain
##
###############################################################################################

function get_option {
    # This function allows user to select input type
    #   - single folder entered at command line or .TXT file
    Clear-Host
    Write-Host -ForegroundColor Cyan "WELCOME TO THE 'CREATE NEW SHARED FOLDER(S) SCRIPT'"
    Write-Host "Do you want to enter a single folder name or use a .TXT file for input?"
    Write-Host "  Please enter '1' to type a single folder name"
    Write-Host "  Please enter '2' to use a .TXT input file"
    $script:chosen = Read-Host "1 or 2"
    switch ($script:chosen) {
        1 {
            Write-Host "You selected to enter a single folder name."
            $script:foldername = Read-Host "Enter the folder name"
            Write-Host "You entered '$script:foldername'"
            $script:isBatch = $FALSE
            $script:MayProceed = $TRUE
        }
        2 {
            Write-Host "You selected to enter a .TXT file name for input."
            $script:ListOfFolders = Read-Host "Enter the .TXT filename"
            Write-Host "You entered '$script:ListOfFolders'"
            $script:MayProceed = Test-Path $script:ListOfFolders
            if ($script:MayProceed) {
                Write-Host "File exists... Proceeding"
                $script:isBatch = $TRUE
            }
            else {
                Write-Host -ForegroundColor Red "File does not exist... Try again"
                Start-Sleep -Seconds 2
            }
        }
        default {
            Write-Host -ForegroundColor Red "You did not choose a valid option... Please try again..."
            Start-Sleep -Seconds 3
        }
    }
}

function assign_rights {
    # This function assigns rights to the folder
    param([string]$Rights, [string]$GroupName)
    $acl = get-acl $FullPath
    $Inherit = [Security.AccessControl.InheritanceFlags] "ContainerInherit, ObjectInherit"
    $Prop = [Security.AccessControl.PropagationFlags] "None"
    $NewRule = new-object Security.AccessControl.FileSystemAccessRule $GroupName, $Rights, $Inherit, $Prop, "Allow"
    $modified = $FALSE
    $modded = $acl.ModifyAccessRule("Add", $NewRule, [ref]$modified)
    set-acl -path $FullPath -AclObject $acl
    if ($modded)
    {
        Write-Host "ACL for $GroupName successfully applied."
    }
    else
    {
        Write-Host -ForegroundColor Red "WARNING!!! ACL for $GroupName failed to apply!"
    }
}

function process_folder {
    param([string]$foldername)
    # This Function creates folder and groups
    # Do we actually have a folder name?
    if ($foldername.length -ne 0) {
        Write-Host "************************************************"
        Write-Host "Now processing folder name:" $foldername "..."

        # Defining path to folder
        $FullPath = "\\W2003\Groups\" + $foldername
        #Does this folder already exist?
        $isThere = Test-Path $FullPath   
        if ($isThere) {
            #Folder already exists. Abort operation.
            Write-Host -ForegroundColor Red "The folder $FullPath already exists and is being skipped. Please verify the folder name."
        }
        else {
            # Create variables holding group names and descriptions
            # If necessary, shorten folder name to 13 characters
            if ($foldername.length  -gt 13) {
                $MODgroup = "s_" + $foldername.substring(0,13) + "_MOD"
                $MODdesc = "$foldername MODify security group"
                $ROgroup  = "s_" + $foldername.substring(0,13) + "_RO"
                $ROdesc = "$foldername ReadOnly security group"
            }
            else {
                $MODgroup = "s_" + $foldername + "_MOD"
                $MODdesc = "$foldername -- MODify security group"
                $ROgroup  = "s_" + $foldername + "_RO"
                $ROdesc = "$foldername -- ReadOnly security group"
            }
            # Defines OU location for Security Group Creation
            $GroupContainer = "OU=Security Groups,OU=Groups,OU=Resources,DC=mydom,DC=local"
            # Create Groups
            Write-Host "Creating MODify security group for this folder..." -NoNewline
            $null = New-QADGroup -ParentContainer $GroupContainer -name $MODgroup -samAccountName $MODgroup -GroupType 'security' -GroupScope 'GLOBAL' -description $MODdesc
            Write-Host "  DONE!"
            Write-Host "Creating ReadOnly security group for this folder..." -NoNewline
            $null = New-QADGroup -ParentContainer $GroupContainer -name $ROgroup -samAccountName $ROgroup -GroupType 'security' -GroupScope 'GLOBAL' -description $ROdesc
            Write-Host "  DONE!"

            # Create Folder
            Write-Host "Creating folder under \\W2003\Groups ..." -NoNewline
            $null = New-Item -path \\W2003\Groups -name $foldername -type directory
            Write-Host "  DONE!"
            # Pause for a bit, otherwise ACL modification may balk for not finding the Group in AD
            Write-Host "Pausing processing for 30 seconds..."
            Start-Sleep -s 10
            Write-Host "20 more seconds to go..."
            Start-Sleep -s 10
            Write-Host "10 more seconds to go..."
            Start-Sleep -s 10
            Write-Host "I know I need to find a better way to do this step... but... Moving On!"
            # Assign rights to the folder for the groups
            # MODIFY RIGHTS
            Write-Host "Assigning MODify ACL entries for this folder to the MOD group..."
            assign_rights "Modify" $MODgroup
            #READONLY RIGHTS
            Write-Host "Assigning ReadOnly ACL entries for this folder to the RO group..."
            assign_rights "Read" $ROgroup
        }
    }
}

$chosen = "0"
$ListOfFolders = ""
$foldername = ""
$isBatch = $FALSE
$MayProceed = $FALSE

do {get_option} until (($chosen -eq "1" -or $chosen -eq "2") -and $MayProceed -eq $TRUE)

# Process folder name list - batch or not
if ($isBatch) {
    $ListOfFolderNames = Import-CSV $ListOfFolders
    foreach ($folder in $ListOfFolderNames) {
        $foldername = $folder.NAME
        process_folder $foldername
    }
}
else {
    process_folder $foldername
}

Write-Host "All Finished. Thank you."

PowerShell Exercise

In case you haven't heard, Microsoft has a new Management Tool out. It's getting a lot of buzz!  :-)

The Scripting Guys (one of my favorite resources) post a script each day (or so). Most of these are in VBscript, from what I have seen.

As a daily exercise, I am translating their non-PowerShell scripts into PowerShell. I thought I would post my work here.

Please feel free to comment on my solutions. I am not spending a ton of time tightening them up. And, as a complete beginner, I am sure there are going to be better ways to do what I do. So, if you have comments, I would love to hear them.

With all of that said, here is today's script --

This script polls your computer for removable drives and copies files to that drive. This could be used to automate backups to a USB key or some such task.

Here is the Scripting Guys solution:

Const OverwriteExisting = TRUE

Set objFSO = CreateObject("Scripting.FileSystemObject")

strComputer = "."

Set objWMIService = GetObject("winmgmts:\\" & strComputer & "\root\cimv2")

Set colDrives = objWMIService.ExecQuery _
    ("Select * From Win32_LogicalDisk Where DriveType = 2")

For Each objDrive in colDrives
    strDrive = objDrive.DeviceID
    strTarget = strDrive & "\" 
    objFSO.CopyFile "C:\Scripts\Test.txt", strTarget, OverWriteExisting
Next

Here is my PowerShell version:

# Find removable drives and copy file(s) to it

# Get list of removable drives
$colDrives = Get-WmiObject -Namespace "root/cimv2" -Query "Select * From Win32_LogicalDisk Where DriveType = 2"

# Cycle through drives and copy file
foreach ($objDrive in $colDrives) {
    $strTarget = $objDrive.DeviceID + "\"
    Copy-Item "C:\scripts\Test.txt" -Destination $strTarget -Force
}
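One mapping worth calling out is VBScript's OverwriteExisting flag: Copy-Item replaces an existing (writable) destination file by default, and -Force additionally pushes past things like read-only files. A quick self-contained check of that behavior (using temp files rather than a real removable drive):

```powershell
# Show that Copy-Item replaces an existing destination file
$tmp = [IO.Path]::GetTempPath()
$src = Join-Path $tmp "copytest-src.txt"
$dst = Join-Path $tmp "copytest-dst.txt"
Set-Content -Path $src -Value "new contents"
Set-Content -Path $dst -Value "old contents"
Copy-Item $src -Destination $dst -Force
Get-Content $dst   # -> new contents
```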

Additional Info

email: support (AT) mangrumtech (DOT) com
mobile: 480-270-4332