SignalWarrant.com – http://www.signalwarrant.com
Signal Warrant Officer stuff, PowerShell scripts, SCCM stuff, other tech crap and whatever else I think the world needs to know.

Visual Studio Code and Azure DevOps (formerly VisualStudio.com) Integration Step-by-Step
http://www.signalwarrant.com/visual-studio-code-and-azure-devops-formerly-visualstudio-com-integration-step-by-step/
Mon, 22 Oct 2018

My previous video walked through the process of using the old Visual Studio Team Services VS Code extension. That extension has now been deprecated. With the latest version of VS Code (I installed v1.28.1) you should see the Azure Repos extension baked into VS Code. The only other requirement is a local Git installation (I installed 2.19.1). Once you have all the software installed it’s much easier to get this going; all the steps are below as well as in the video.

Step 1: Download and install VS Code (Download Link: https://code.visualstudio.com/download). The default install should work fine.

Step 2: Download and install Git (Download Link: https://git-scm.com/download). Accept all the defaults, except set VS Code as the default editor. That part is optional, but I did it.

Step 3: Access your VisualStudio.com / Azure Repos repo and click the “Clone to VS Code” link… see the video for details. You’ll be asked where in the file structure to clone the repo and prompted for your VisualStudio.com / Azure Repos credentials.

At this point you should be all set. Test it by changing a file and committing it to the cloud repo, just to make sure everything works. It’s a much easier process than before, but I wasn’t able to find instructions on the interweb; hence this video and write-up.
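
If you prefer the command line over the Source Control pane, a quick smoke test from the VS Code integrated terminal might look like this (the commit message is just an example):

# Run from the folder where you cloned the repo
git status
git add .
git commit -m "Test commit from VS Code"
git push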

Thanks for watching / reading.

Enterprise Security: How to configure and use Group Managed Service Accounts
http://www.signalwarrant.com/enterprise-security-how-to-configure-and-use-group-managed-service-accounts/
Mon, 10 Sep 2018

I routinely see organizations big and small still using “regular” Active Directory user accounts as service accounts. Typically, the passwords for those service accounts are set to never expire, or an alternate password policy only requires that the password be changed yearly. If your organization is managing service accounts like this, you are only increasing the potential for exploitation when a nefarious actor gets inside your enterprise. It’s a matter of WHEN, not if.

Microsoft introduced Group Managed Service Accounts in Windows Server 2012 to address this specific situation. Group Managed Service Accounts (gMSAs) are an upgrade from the Managed Service Accounts available in Windows Server 2008 in that a gMSA can be used on multiple servers. There is no need to create a specific service account for each server, although your internal policies may dictate otherwise.

Why use gMSA?

  • The Password is managed in Active Directory (AD) and is changed every 30 days by default.
  • Because the password is managed by AD, no human will ever know the password.
  • gMSA passwords are 240 bytes long so they are complex.
  • gMSAs are not permitted to logon interactively.

How do I configure and use a gMSA?

The code below is everything you need to get started with gMSAs. Also, take a look at the video below for a more in-depth walk-through of the process.

Do yourself a favor… get rid of legacy service accounts. It’s one of those things you can do to incrementally harden your enterprise.

<# 
Step 1 - (on a non-DC)
Add-WindowsFeature RSAT-AD-PowerShell

Step 2 - Create a Security group and add all the hostnames you will use the gMSA on. 
These are the computers permitted to retrieve the password from AD
#>

$gMSA_Name = 'svc_sql'
$gMSA_FQDN = 'svc_sql.hall.test'
# Get the computer names from the 'gMSAs' security group created in Step 2
$gMSA_HostNames = Get-ADGroupMember -Identity gMSAs | Select-Object -ExpandProperty Name

# Add the KDS root key. Backdating the effective time by 10 hours makes the key
# usable immediately, which is fine for a lab; in production, let it replicate first.
Add-KDSRootKey -EffectiveTime (Get-Date).AddHours(-10)

# Get the principal for the computer account(s) in $gMSA_HostNames
$gMSA_HostsGroup = $gMSA_HostNames | ForEach-Object { Get-ADComputer -Identity $_ }

# Create the gMSA
New-ADServiceAccount -Name $gMSA_Name -DNSHostName $gMSA_FQDN -PrincipalsAllowedToRetrieveManagedPassword $gMSA_HostsGroup

# Install the gMSA on the target machine (run on each server in the group)
Install-ADServiceAccount svc_sql

# Test the installation
Test-ADServiceAccount svc_sql

# To remove the gMSA
Remove-ADServiceAccount -Identity svc_sql
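
Once Test-ADServiceAccount returns True, you can point a service or a scheduled task at the account. As a hedged example (the HALL domain below is an assumption based on the hall.test FQDN used above, and the script path is hypothetical), a scheduled task can run under the gMSA like this; note the trailing $ and that no password is supplied:

# Run the task as the gMSA; the domain and script path are placeholders
$action    = New-ScheduledTaskAction -Execute 'PowerShell.exe' -Argument '-File C:\scripts\nightly.ps1'
$trigger   = New-ScheduledTaskTrigger -Daily -At 2am
$principal = New-ScheduledTaskPrincipal -UserId 'HALL\svc_sql$' -LogonType Password
Register-ScheduledTask -TaskName 'Nightly Job' -Action $action -Trigger $trigger -Principal $principal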

Quickly deploy LAB Virtual Machines with the AutomatedLab PowerShell Framework
http://www.signalwarrant.com/quickly-deploy-lab-virtual-machines-with-the-automatedlab-powershell-framework/
Thu, 30 Aug 2018

My Requirements:

  • Use PowerShell to minimize the time spent on creating VMs.
  • Install ADDS along with the VM creation.
  • Use a differencing disk for all the VMs to save disk space.

In the spirit of kickstarting this channel again, I needed to spin up a few Virtual Machines to facilitate PowerShelling. I was going to go through the process of using the Hyper-V management Cmdlets when I stumbled upon the AutomatedLab framework on GitHub (https://github.com/AutomatedLab/AutomatedLab). I had never used this framework before, so a little trial and error was in order.

Kudos to the developers, they have included an assortment of sample Lab scripts that you can easily modify to suit your needs and get going fairly quickly without a lot of hassle. The framework also includes the ability to install some applications like Exchange, SQL and a few others. There are many more features included that I haven’t started playing with yet. It’s worth having a look, particularly if you find yourself doing things like demoing applications or testing GPO settings on a regular basis.

 

The code I used to spin out my lab is below.

$labName = 'TestLab'
$domain = 'test.lab'
$adminAcct = 'Administrator'
$adminPass = 'YourPasswordHere'
$labsources = "D:\LabSources"

#Create an empty lab template and define where the lab XML files and the VMs will be stored
New-LabDefinition -Name $labName -DefaultVirtualizationEngine HyperV

#Network definition
Add-LabVirtualNetworkDefinition -Name $labName -AddressSpace 10.1.1.0/24

#Domain definition with the domain admin account
Add-LabDomainDefinition -Name $domain -AdminUser $adminAcct -AdminPassword $adminPass
Set-LabInstallationCredential -Username $adminAcct -Password $adminPass

#Default parameter values for all the machines
$PSDefaultParameterValues = @{
    'Add-LabMachineDefinition:Network'               = $labName
    'Add-LabMachineDefinition:ToolsPath'             = "$labSources\Tools"
    'Add-LabMachineDefinition:IsDomainJoined'        = $true
    'Add-LabMachineDefinition:DnsServer1'            = '10.1.1.1'
    'Add-LabMachineDefinition:OperatingSystem'       = 'Windows Server 2016 Datacenter (Desktop Experience)'
    'Add-LabMachineDefinition:DomainName'            = $domain
    'Add-LabMachineDefinition:Memory'                = 4GB
    'Add-LabMachineDefinition:Processors'            = 1
    'Add-LabMachineDefinition:MinMemory'             = 1GB
    'Add-LabMachineDefinition:MaxMemory'             = 4GB
    'Add-LabMachineDefinition:EnableWindowsFirewall' = $false
}

# Root Domain Controller
Add-LabMachineDefinition -Name TestDC -IpAddress 10.1.1.1 -Roles RootDC

# Test SVR 1
Add-LabMachineDefinition -Name TestSVR1 -IpAddress 10.1.1.2

# Test SVR 2
Add-LabMachineDefinition -Name TestSVR2 -IpAddress 10.1.1.3

Install-Lab

Show-LabDeploymentSummary -Detailed
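
If you want to reset or reclaim the lab later, AutomatedLab can tear it all down again. A minimal sketch, assuming the same $labName as above:

# Removes the lab VMs, disks and virtual network created above
Remove-Lab -Name $labName -Confirm:$false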

 

Enjoy

I’m resurrecting the YouTube channel.
http://www.signalwarrant.com/im-resurrecting-the-youtube-channel/
Wed, 22 Aug 2018

I’m going to resurrect the YouTube channel and start making videos as time allows. That said, if you have something specific you would like to see, contact me using one of the methods below and I’ll see if I can work it in at some point.

More to follow…

 

Advanced Auditing with PowerShell Desired State Configuration Manager (DSC)
http://www.signalwarrant.com/advanced-auditing-with-powershell-desired-state-configuration-manager-dsc/
Wed, 06 Dec 2017

Greetings, interweb. It’s been a while, but I’m finally back with a new video.

This video focuses on Desired State Configuration (DSC) and how to configure Advanced Auditing with it. You can certainly configure Advanced Auditing with a Group Policy Object (GPO) on domain-joined machines. My use case for DSC is a public-facing web server sitting in a DMZ outside the trusted side of your network. In that case, you probably don’t want that public-facing server on your domain, but you still want to audit it, patch it, and run all of the other security processes you would run on any server on the trusted side of the network.

For this exercise, we will set up a DSC Pull Server on a domain-joined server and configure the public-facing server in the DMZ to pull its configuration from the Pull Server.

Part 1: The Pull Server

All of this code is executed on the domain-joined Pull server.

Step 1: Install the required DSC modules; we need the three below. You can get all of them from powershellgallery.com.

These 2 are for DSC itself

Install-Module -Name PSDscResources

Install-Module -Name xPSDesiredStateConfiguration

 

This module allows us to configure Advanced Auditing with DSC

Install-Module -Name AuditPolicyDsc

Step 2: You need a certificate on the Pull Server, and any client that will connect to the pull server must trust it. The pull server uses this certificate for its HTTPS endpoint, which matters because the client is not in the domain.

In my case, I just installed Certificate Services on my Pull Server. It’s literally the default installation; all you need is a Root CA. You could also manually create a certificate without installing Certificate Services, but I found that more problematic than it was worth. Certificate Services was the path of least resistance for me.

Certificate Services Step by Step: https://technet.microsoft.com/en-us/library/cc772393(v=ws.10).aspx
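
If you want to script that part as well, a minimal sketch of a default Enterprise Root CA install on the Pull Server might look like the following (run from an elevated session; the CA common name matches the one referenced in the certreq call below):

Install-WindowsFeature -Name ADCS-Cert-Authority -IncludeManagementTools
Install-AdcsCertificationAuthority -CAType EnterpriseRootCA -CACommonName 'Signalwarrant-DC-CA' -Confirm:$false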

Step 3: Now we need to request and export a certificate so we can import it on the DSC client side. You’ll need to import the certificate on any non-domain DSC client. All of this code is from http://duffney.io/Configure-HTTPS-DSC-PullServerPSv5; I changed some of the information to fit my naming scheme, but other than that it’s almost identical.

# Code Source: http://duffney.io/Configure-HTTPS-DSC-PullServerPSv5
$inf = @"
[Version] 
Signature="`$Windows NT`$"

[NewRequest]
Subject = "CN=DC, OU=IT, O=Signalwarrant, L=Augusta, S=SE, C=US"
KeySpec = 1
KeyLength = 2048
Exportable = TRUE
FriendlyName = PSDSCPullServerCert
MachineKeySet = TRUE
SMIME = False
PrivateKeyArchive = FALSE
UserProtected = FALSE
UseExistingKeySet = FALSE
ProviderName = "Microsoft RSA SChannel Cryptographic Provider"
ProviderType = 12
RequestType = PKCS10
KeyUsage = 0xa0
"@

$infFile = "$env:HOMEDRIVE\temp\certrq.inf"
$requestFile = "$env:HOMEDRIVE\temp\request.req"
$CertFileOut = "$env:HOMEDRIVE\temp\certfile.cer"

mkdir $env:HOMEDRIVE\temp
$inf | Set-Content -Path $infFile

& certreq.exe -new "$infFile" "$requestFile"

# Make sure the DC matches everywhere
& certreq.exe -submit -config DC.signalwarrant.local\Signalwarrant-DC-CA -attrib 'CertificateTemplate:WebServer' "$requestFile" "$CertFileOut"

& certreq.exe -accept "$CertFileOut"

## Copy the certfile to any clients and install the Cert to Local Machine

Step 4: Now we can set up the DSC pull server itself. Almost all of this code is identical to what you will find on MSDN, with a few modifications to match my naming scheme. See the video above for a more detailed explanation of what’s going on here.

# 4. Configure the Pull Server
Configuration DscPullServer
{
  param
  (
    [string[]]$NodeName = 'localhost',

    [ValidateNotNullOrEmpty()]
    [string] $certificateThumbPrint,

    [Parameter(Mandatory)]
    [ValidateNotNullOrEmpty()]
    [string] $RegistrationKey
  )

  Import-DSCResource -ModuleName PSDesiredStateConfiguration
  Import-DSCResource -ModuleName xPSDesiredStateConfiguration

    Node $NodeName
    {
        WindowsFeature DSCServiceFeature
        {
        Ensure = 'Present'
        Name = 'DSC-Service'
        }

        xDscWebService PSDSCPullServer
        {
        Ensure = 'present'
        EndpointName = 'PSDSCPullServer'
        Port = 8080
        PhysicalPath = "$env:SystemDrive\inetpub\PSDSCPullServer\"
        CertificateThumbPrint = $certificateThumbPrint
        # This is where the packaged modules needed by client go
        ModulePath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Modules"
        # Client .MOF and Checksum files go here
        ConfigurationPath = "$env:PROGRAMFILES\WindowsPowerShell\DscService\Configuration"
        State = 'Started'
        DependsOn = '[WindowsFeature]DSCServiceFeature'
        UseSecurityBestPractices = $true
        }
        
        File RegistrationKeyFile
        {
        Ensure = 'Present'
        Type = 'File'
        DestinationPath = "$env:ProgramFiles\WindowsPowerShell\DscService\RegistrationKeys.txt"
        Contents = $RegistrationKey
        }
    }

}

$guid = [guid]::newGuid()

$cert = Get-ChildItem -Path Cert:\LocalMachine\My | Where-Object {$_.FriendlyName -eq 'PSDSCPullServerCert'}

DscPullServer -certificateThumbPrint $cert.Thumbprint -RegistrationKey $guid -OutputPath $env:HOMEDRIVE\dsc

Start-DscConfiguration -Path $env:HOMEDRIVE\dsc -Wait -Verbose
 
# Server URL: copy and paste the Server URL into a web browser.
# You should get some XML output if everything is working OK.
# https://YourServerNameHere:8080/PSDSCPullServer.svc/

Once the DSC installation is complete, check the web address for the Pull Server. It should be https://<YourServerName>:8080/PSDSCPullServer.svc/. If you get some XML output, your pull server is good to go… theoretically.
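
If you’d rather check from PowerShell than a browser, something like the following should return the XML service document (assuming the certificate is trusted on the machine you run it from and the name matches yours):

Invoke-WebRequest -Uri 'https://YourServerNameHere:8080/PSDSCPullServer.svc/' -UseBasicParsing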

Step 5: Create the MOF files for the DSC client. This code is pretty much the same as the example in the module’s GitHub repo: https://github.com/PowerShell/AuditPolicyDsc

# 5. Create the client MOFs
# https://github.com/PowerShell/AuditPolicyDsc

Configuration AuditPolicyCsv {
    param(
        [String] $NodeName = 'SVR'
    )    
   
    Import-DscResource -ModuleName AuditPolicyDsc

    Node $NodeName {
        AuditPolicyCsv auditPolicy {
            IsSingleInstance = 'Yes'
            CsvPath = "C:\audit.csv"
        }
    }
}

# Compile the configuration into a MOF file (creates .\AuditPolicyCsv\SVR.mof)
AuditPolicyCsv

# Create the checksum file the pull server needs alongside the MOF
New-DscChecksum -Path '.\AuditPolicyCsv\SVR.mof'

Step 6: We need to package the MOFs and DSC resources and put them in the correct file structure on the Pull Server so the clients can pull what they need. The $MOFPath in the script is where you created the MOF files in the step above; you may need to change this if you used a different file location.
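
The packaging script itself isn’t shown in the post, so here is a hedged sketch of what this step usually looks like on a pull server: copy the MOF plus a checksum into the Configuration folder, and zip each required resource module (here AuditPolicyDsc) into the Modules folder with its own checksum. The $MOFPath value and the module version handling are assumptions; adjust them to your environment.

# Hedged sketch of the packaging step; paths are assumptions
$MOFPath    = 'C:\dsc\AuditPolicyCsv'        # where the SVR.mof from Step 5 lives
$ConfigPath = "$env:ProgramFiles\WindowsPowerShell\DscService\Configuration"
$ModulePath = "$env:ProgramFiles\WindowsPowerShell\DscService\Modules"

# MOF + checksum, named after the ConfigurationNames value the client will request
Copy-Item -Path "$MOFPath\SVR.mof" -Destination "$ConfigPath\SVR.mof" -Force
New-DscChecksum -Path "$ConfigPath\SVR.mof" -Force

# Package the AuditPolicyDsc resource as <ModuleName>_<Version>.zip with a checksum
$module = Get-Module -Name AuditPolicyDsc -ListAvailable | Select-Object -First 1
$zip    = Join-Path -Path $ModulePath -ChildPath ("AuditPolicyDsc_{0}.zip" -f $module.Version)
Compress-Archive -Path "$($module.ModuleBase)\*" -DestinationPath $zip -Force
New-DscChecksum -Path $zip -Force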

DSC Pull Server Mission complete!

Part 2: The DSC Client

Step 1: Configure the Local Configuration Manager (LCM) on the client. Before you run the code below, it’s worth testing connectivity to the Pull Server website. In my case, the Pull Server URL is https://10.0.10.1:8080/PsDscPullserver.svc. I use the IP address of the Pull Server because the client is not on the domain, so DNS is not going to work unless you add a manual entry to the hosts file on the client.

# Run on the target node
[DSCLocalConfigurationManager()]
Configuration LCMConfig {
    Node SVR {
        Settings {
            ConfigurationMode = 'ApplyAndAutoCorrect'
            RefreshMode = 'Pull'
        }

        ConfigurationRepositoryWeb PullServer {
            ServerURL = 'https://10.0.10.1:8080/PsDscPullserver.svc'
            AllowUnsecureConnection = $false
            RegistrationKey = 'ab5f4916-a937-4f99-ae72-8b9db3dd8a60'
            ConfigurationNames = @('SVR')
        }

        ResourceRepositoryWeb PullServerModules {
            ServerURL = 'https://10.0.10.1:8080/PsDscPullserver.svc'
            AllowUnsecureConnection = $false
            RegistrationKey = 'ab5f4916-a937-4f99-ae72-8b9db3dd8a60'
        }
    }
}

LCMConfig

Set-DscLocalConfigurationManager -ComputerName $env:COMPUTERNAME -Path '.\LCMConfig' -Verbose

Get-DSCLocalConfigurationManager

# This will show you the location of the CSV file that configuration is using
Get-DscConfiguration

Step 2: If you run Get-DscConfiguration and it comes back ok, the client should be properly registered with the Pull server.

Step 3: Now that the Pull Server and the client are working, we need to work out the format of the CSV file that will drive the Advanced Auditing configuration. See the video to understand how I figured out the format. My CSV is linked below.

Step 4: Once you have the CSV file complete and in the correct file location, run the following to refresh the DSC config.

# Force a DSC refresh
Start-DscConfiguration -UseExisting -Verbose -Wait

Run this to verify the Audit Policy is exactly the way you had it configured in the CSV.

# Check the AuditPol config took
# (*lusion* matches the 'Inclusion Setting' and 'Exclusion Setting' columns)
auditpol.exe /get /category:* /r |
ConvertFrom-Csv |
Select-Object -Property Subcategory*, *lusion*

Hope you enjoyed that tutorial… may the PowerShell force be with you all. 🙂

 

 

 

 

Convert text files to PDF then merge PDFs in bulk with PowerShell and iTextSharp
http://www.signalwarrant.com/convert-to-and-merge-pdfs-with-itextsharp-and-powershell/
Tue, 15 Aug 2017

To demonstrate this process, I will show you how I export all the PowerShell help and convert it to PDF files that I use instead of reading the help in the PowerShell console. I realize you can view the PowerShell help files in the console and online; the purpose of this script is to demonstrate how I solved what I viewed as a problem for me. You may be just fine looking through the help files in the console or online. As with any PowerShell script anyone hacks together, your mileage may vary.

I have split the code into three functions: one to export the help files to text files, one to convert those text files to PDFs, and one to merge all the PDFs in a given folder into a single PDF. This process requires the iTextSharp .NET library, which you can download here (https://github.com/itext/itextsharp). This is probably one of the more basic implementations of the iTextSharp library; it has a lot of functionality that I didn’t need to accomplish my mission. You can find more technical information on iTextSharp here (https://afterlogic.com/mailbee-net/docs-itextsharp/).

The Problem

For me, I hate looking through the help files in the console… for a few reasons.

  1. By default, the font is too small for me to see well. Old-people problems, I know; it is what it is. I find reading a PDF file much easier on the eye. I dump all the help files on a network-accessible share that I can quickly access from any machine on my network.
  2. I can highlight, comment on, and bookmark the PDF files as needed. This may require the full version of Acrobat Pro or an equivalent; I’m not sure. My organization has Acrobat Pro, so this isn’t an issue for me. With highlighting, comments, and bookmarks, I can turn the help files into more of a quick reference; it just works for me.
  3. I frequently work on servers that are not accessible to the larger internet, meaning I can’t update help files for those machines unless I do it manually. Instead, I keep these PDF files on an external hard drive that I carry around anyway.

I know the help files are updated occasionally, but it’s not often enough to make much of a difference; I usually update my help files a couple of times a year. All that said, below is the code I use to make all the txt files and convert them to PDFs. I have also combined the help for each cmdlet in a module into a single PDF that I have made available here on signalwarrant.com. Hit the Downloads link and use them if you think you can get some benefit from them.

This script exports all the help for each cmdlet to the specified file path.

Function Export-PShelp {
  <#
    .SYNOPSIS
    Exports the -full help for each CMDlet available on your computer.
    
    .DESCRIPTION
    Gets all the Modules available on the computer, loops through those Modules to retrieve each CMDlet name as well as create a folder for each module. Loops through each CMDlet in each module and exports the -full help for each to the $filepath\$modulename

    .PARAMETER filePath
    -filePath: The folder that you're exporting all the help files to on your local hard drive.

    .EXAMPLE
    Export-PShelp -filePath 'c:\helpfiles'

    .NOTES

    .LINK
    
  #>

Param(
  [Parameter(
    Mandatory=$True,HelpMessage='Path where you want the helpfiles to be exported',
    Position=1
  )][string]$filePath
)
    
If(!(Test-Path -Path $filePath)){
  New-Item -Path $filePath -ItemType Directory
}

# You get some errors if a module has no help so I just 
# turned error reporting off.    
$ErrorActionPreference = 'silentlycontinue'

# Get each module name and loop through each to retrieve cmdlet names
$modules = Get-Module -ListAvailable | 
    Select-Object -ExpandProperty Name

ForEach ($module in $modules){
  # Creates a folder for each Module
  If (!(Test-Path -Path "$filePath\$($module)")){
    New-Item -ItemType Directory -Path "$filePath\$($module)"
  }

  # Get the CMDLet names for each CMDlet in the Module
  $modulecmdlets = Get-Command -Module $module | 
    Select-Object -ExpandProperty name
        
    ForEach ($modulecmdlet in $modulecmdlets){
      Get-Help -Name $($modulecmdlet) -Full | 
      Out-File -FilePath "$filePath\$($module)\$($modulecmdlet).txt"
    }  
  }
}



 

The script below will convert files to PDF.

Function ConvertTo-PDF {
  <#
    .DESCRIPTION
    Convert 1 or many files to PDFs
    
    .PARAMETER filePath
    -filePath: The path to the folder that contains all your text files

    .PARAMETER dllPath
    -dllPath: The Path to the iTextSharp.DLL file

    .EXAMPLE
    ConvertTo-PDF -filePath 'C:\help' -filetype 'txt' -dllPath 'C:\itextsharp.dll'

    .REQUIREMENTS
    - iTextSharp 5.5.10 .NET Library
    - You may have to Set execution policy to less restrictive policy

    .LINK
    iTextSharp .NET library: https://github.com/itext/itextsharp/releases/tag/5.5.11
  #>


Param(
  [Parameter(
    Mandatory=$True,
    HelpMessage='Add the path you want to save help to EX. c:\help'
    )][string]$filePath,

  [Parameter(
    Mandatory=$True,HelpMessage='What file type to convert'
    )][string]$filetype,

  [Parameter(
    Mandatory=$True,HelpMessage='path to the itextsharp.dll file EX. c:\itextsharp.dll'
    )][string]$dllPath
)

Begin{
  Try{
    Add-Type -Path $dllPath -ErrorAction Stop
  }
  Catch{
    Throw "Could not load iTextSharp DLL from $($dllPath).`nPlease check that the dll is located at that path."
  }
}

Process{
  $txtFiles = Get-ChildItem -Path $filePath -Recurse -Filter "*.$filetype"

  ForEach ($txtFile in $txtFiles){
    $path = "$($txtFile.DirectoryName)\$($txtFile.BaseName).pdf"
    $doc = New-Object -TypeName iTextSharp.text.Document
    $fileStream = New-Object -TypeName IO.FileStream -ArgumentList ($path, [System.IO.FileMode]::Create)
    [iTextSharp.text.pdf.PdfWriter]::GetInstance($doc, $filestream)
    [iTextSharp.text.FontFactory]::RegisterDirectories()

    $paragraph = New-Object -TypeName iTextSharp.text.Paragraph
    $paragraph.add(( Get-Content -Path $($txtFile.FullName) |
        ForEach-Object {
            "$_`n"
        })) | Out-Null
    $doc.open()
    $doc.add($paragraph) | Out-Null
    $doc.close()
}

}
}

 

This script will merge all the PDFs in a given folder into one PDF. You would think that would take a long time with many PDF files, but I created a 20,000-page PDF in a second or so.

Function Merge-PDFs{
  <#
    .SYNOPSIS
    Merges PDF files into 1 PDF file.

    .PARAMETER filePath
    -filePath: Any PDF file in the filepath will be combined into 1 PDF file named All_PowerShell_Help.pdf.

    .PARAMETER dllPath
    -dllPath: The Path to the iTextSharp.DLL file

    .EXAMPLE
    Merge-PDFs -filePath 'c:\scripts\help' -dllPath 'c:\scripts\itextsharp.dll'
    Describe what this call does

    .NOTES
    Modified Code from here
    http://geekswithblogs.net/burncsharp/archive/2007/04/13/111629.aspx

  #>


  Param(
  [Parameter(
    Mandatory=$True,
    HelpMessage='Add the path you want to save help to EX. c:\help'
  )][string]$filePath,
  
  [Parameter(
    ValueFromPipelinebyPropertyName=$true
    )][string]$dllPath
  )

  Begin{
    Try{
      Add-Type -Path $dllPath -ErrorAction Stop
    }
    Catch{
      Throw "Could not load iTextSharp DLL from $($dllPath).`nPlease check that the dll is located at that path."
    }
  }

  Process{ 

##############################################################
# Merges all the PDF files for each Module in to 1 PDF file 
# called All_PowerShell_Help.pdf in $filepath
# 21,000 pages to maybe a second, iTextSharp is fast.
##############################################################

    $pdfs = Get-ChildItem -Path $filePath -Recurse -Filter '*.pdf'
    $ErrorActionPreference = 'silentlycontinue'
    [void] [System.Reflection.Assembly]::LoadFrom(
      [System.IO.Path]::Combine($filePath, $dllPath)
    )
    $output = [System.IO.Path]::Combine($filePath, 'All_PowerShell_Help.pdf')
    $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList ($output,[System.IO.FileMode]::OpenOrCreate)
    $document = New-Object -TypeName iTextSharp.text.Document
    $pdfCopy = New-Object -TypeName iTextSharp.text.pdf.PdfCopy -ArgumentList ($document, $fileStream)
    $document.Open()
    
    foreach ($pdf in $pdfs) {
        $reader = New-Object -TypeName iTextSharp.text.pdf.PdfReader -ArgumentList ($pdf.FullName)
        $pdfCopy.AddDocument($reader)
        $reader.Dispose()
    }
$pdfCopy.Dispose()
$document.Dispose()
$fileStream.Dispose()

#############################################################
# Find all directories in $filepath that are empty and delete
# them. Some directories created for DSC Resources will be empty
#############################################################
    (Get-ChildItem -Path $filePath -Recurse | 
    Where-Object {$_.PSIsContainer -eq $True}) | 
    Where-Object {$_.GetFiles().Count -eq 0} | 
    Remove-Item
}

}

 

This script is not part of any of the functions above but I figured I would include it. It loops through each folder recursively and merges all the PDF files in each folder.

##############################################################
# Merges all the PDF files for each Module in to 1 PDF file per
# module called Help_<moduleName>.pdf in $filepath
##############################################################

$folders = Get-ChildItem -Path $filePath -Directory
$ErrorActionPreference = 'silentlycontinue'
foreach ($folder in $folders){
    $pdfs = Get-ChildItem -Path $folder.fullname -recurse -Filter '*.pdf'

    [void] [System.Reflection.Assembly]::LoadFrom(
        [System.IO.Path]::Combine($filePath, $dllPath)
    )
        $output = [System.IO.Path]::Combine($filePath, "Help_$($folder[0].Name).pdf")
        $fileStream = New-Object -TypeName System.IO.FileStream -ArgumentList ($output, [System.IO.FileMode]::OpenOrCreate)
        $document = New-Object -TypeName iTextSharp.text.Document
        $pdfCopy = New-Object -TypeName iTextSharp.text.pdf.PdfCopy -ArgumentList ($document, $fileStream)
        $document.Open()
    
        foreach ($pdf in $pdfs) {
            $reader = New-Object -TypeName iTextSharp.text.pdf.PdfReader -ArgumentList ($pdf.FullName)
            $pdfCopy.AddDocument($reader)
            $reader.Dispose()
        }
    $pdfCopy.Dispose()
    $document.Dispose()
    $fileStream.Dispose()
}
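
Putting the three functions together, a typical end-to-end run might look like the sketch below (the folder and DLL paths are just examples; point them at wherever you extracted iTextSharp and want the help files):

# Export help to text, convert the text files to PDF, then merge everything
Export-PShelp -filePath 'C:\helpfiles'
ConvertTo-PDF -filePath 'C:\helpfiles' -filetype 'txt' -dllPath 'C:\itextsharp\itextsharp.dll'
Merge-PDFs    -filePath 'C:\helpfiles' -dllPath 'C:\itextsharp\itextsharp.dll'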

 

Automate Creating Lab Virtual Machines in Azure with PowerShell
http://www.signalwarrant.com/automate-creating-lab-virtual-machines-in-azure-with-powershell/
Wed, 12 Jul 2017

As you may or may not know, I recently decommissioned my old Dell PowerEdge 1950 server that I used for a few lab virtual machines. While experimenting with PowerShell on those virtual machines, I have found myself in situations where it was easier to delete the virtual machines and re-create them than to troubleshoot something I fouled up in the registry. After the second time rebuilding the lab VMs through the Azure website, I decided to script the process.

The script below will take input from a CSV file and create a virtual machine in your Azure subscription for each row in the CSV file. My example creates 2 virtual machines but you can obviously add as many as you need.

For a production environment in Azure, I would suggest Snapshotting the Virtual Machines. There is a good write-up of the process here: http://www.coreazure.com/snapshot-vms-in-azure-resource-manager/. In my case, for a Lab, snapshots use more storage which costs more $$.

 

Function New-AzureLab {
  <#
    .SYNOPSIS
    New-AzureLab will create 1 or multiple VMs in Azure based on input parameters from a CSV

    .DESCRIPTION
    Create a CSV file like below:
    VMName,Location,InterfaceName,ResourceGroupName,VMSize,ComputerName
    SP,EastUS,SP_Int,SignalWarrant_RG,Basic_A2,SP

    The function will read the input from the CSV file and create VMs in an Azure Resource Group
    
    .PARAMETER csvpath
    The full path to your CSV file (eg c:\scripts\VMs.csv)

    .EXAMPLE
    New-AzureLab -csvpath c:\scripts\VMs.csv
    Imports the applicable values from the CSV file

    .NOTES
    1. I already had a Resource Group in Azure therefore I put all the VMs in the same group.
    2. I already had a VM network created, all my VMs are in the same network.

    .LINK
    URLs to related sites
    A good writeup on the process - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/quick-create-powershell
    Azure VM size values - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/sizes-general
    Azure VM Publisher, Offer, SKUs, Version info for various VM types - https://docs.microsoft.com/en-us/azure/virtual-machines/windows/cli-ps-findimage

    .INPUTS
    CSV file path

    .OUTPUTS
    None

  #>

    Param(
        [Parameter(Mandatory=$True,HelpMessage='Enter the Path to your CSV')]
        [string]$csvpath
        )
    # Lets make sure the CSV file is actually there
    $testpath = Test-Path -Path $csvpath
    If (!$testpath){
        clear-host
        throw '***** Invalid CSV Path *****'
    } else {

        # This will the be local username and password for each VM
        $Credential = Get-Credential

        # Import the information from my CSV
        Import-Csv -Path "$csvPath" | ForEach-Object {
        
        # Get the Storage Account Information
        $StorageAccount = Get-AzureRmStorageAccount

        # This is the naming convention for the OS Disk
        $OSDiskName = $_.'VMName' + '_OSDisk'

        # Network Information
        $PublicIP = New-AzureRmPublicIpAddress -Name $_.'InterfaceName' -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -AllocationMethod Dynamic
        $VMNetwork = Get-AzureRmVirtualNetwork
        $Interface = New-AzureRmNetworkInterface -Name $_.'InterfaceName' -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -SubnetId $VMNetwork.Subnets[0].Id -PublicIpAddressId $PublicIP.Id

        ## Setup local VM object
        $VirtualMachine = New-AzureRmVMConfig -VMName $_.'VMName' -VMSize $_.'VMSize'
        $VirtualMachine = Set-AzureRmVMOperatingSystem -VM $VirtualMachine -Windows -ComputerName $_.'ComputerName' -Credential $Credential -ProvisionVMAgent -EnableAutoUpdate
        $VirtualMachine = Set-AzureRmVMSourceImage -VM $VirtualMachine -PublisherName MicrosoftWindowsServer -Offer WindowsServer -Skus 2016-Datacenter -Version 'latest'
        $VirtualMachine = Add-AzureRmVMNetworkInterface -VM $VirtualMachine -Id $Interface.Id
        $OSDiskUri = $StorageAccount.PrimaryEndpoints.Blob.ToString() + 'vhds/' + $OSDiskName + '.vhd'
        $VirtualMachine = Set-AzureRmVMOSDisk -VM $VirtualMachine -Name $OSDiskName -VhdUri $OSDiskUri -CreateOption FromImage

        ## Create the VM in Azure
        New-AzureRmVM -ResourceGroupName $_.'ResourceGroupName' -Location $_.'Location' -VM $VirtualMachine -Verbose

        }
    }
}
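
To use the function, build a CSV with one row per VM and point the function at it. A minimal usage sketch (the VM names, resource group, and file path below are just examples, and you need to be logged in to AzureRM first):

# Log in to the subscription (AzureRM module)
Login-AzureRmAccount

# Example CSV -- one row per VM to create
@'
VMName,Location,InterfaceName,ResourceGroupName,VMSize,ComputerName
DC01,EastUS,DC01_Int,SignalWarrant_RG,Basic_A2,DC01
SVR01,EastUS,SVR01_Int,SignalWarrant_RG,Basic_A2,SVR01
'@ | Set-Content -Path 'C:\scripts\VMs.csv'

New-AzureLab -csvpath 'C:\scripts\VMs.csv'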

 

Hey PowerShell… Text me if my Domain Admins Group changes
http://www.signalwarrant.com/hey-powershell-text-me-if-my-domain-admins-group-changes/
Thu, 29 Jun 2017

This is why I love PowerShell… it’s simple, yet functional.

From an administrative perspective, I think we can all agree that any change to your Domain Admins group without your knowledge would be of interest to you. If you’re in a large organization with access to enterprise management tools, you probably have some widget that fires off a message to you or a group of people when a change is detected… or maybe you don’t.

If you’re an admin at a small business and maybe even some medium sized businesses, you may not have access to those enterprise management tools and widgets. Turns out, we can use PowerShell to monitor any group for us and notify us when a change occurs. It’s actually pretty simple.

You can even have PowerShell send you a text message… which is pretty cool.

I’m using the script to keep an eye on my Domain Admins group, but you could easily adapt it to monitor services or processes. For example, you might monitor your Exchange server’s Transport service and have the script send you an email and text message if it stops for whatever reason.

First, we have to get all the members of the Domain Admins Group and export to an xml file.

# Run this once to get the Domain Admins group baseline
Get-ADGroupMember -Server signalwarrant.local -Identity "Domain Admins" |
    Select-Object -ExpandProperty samaccountname | 
    Export-Clixml -Path 'C:\scripts\CurrentDomainAdmins.xml'

This is the script we’ll run on a schedule.

# This is the script we'll run on a regular basis

# Get the filehash of the CurrentDomainAdmins.xml
    $CurrentAdminsHash = Get-FileHash -Path 'C:\scripts\CurrentDomainAdmins.xml' | 
      Select-Object -expandProperty Hash
# Get the current date
    $Date = Get-Date
# This is the file we're testing the CurrentDomainAdmins.xml file against
    $newAdmins = 'c:\scripts\NewAdmins.xml'
# A variable we will use in the if statement below
    $Change = ''

# As we run the test we're going to get the contents of the Domain Admins Group
Get-ADGroupMember -Server signalwarrant.local -Identity 'Domain Admins' |
    Select-Object -ExpandProperty samaccountname | 
    Export-Clixml -Path $newAdmins -Force

# Get the filehash of the new file 
$NewAdminsHash = Get-FileHash -Path $newAdmins | Select-Object -expandProperty Hash

# If the CurrentDomainAdmins.xml (our baseline file) and NewAdmins.xml do not match
If ($NewAdminsHash -ne $CurrentAdminsHash){
    
    # Do all of this if a change is detected
    $Change = 'Yes'
    $ChangesDetected = 'Domain Admins Group changed detected on: ' + $date
    $ChangesDetected | Out-File -FilePath 'C:\scripts\DA_Changes.txt' -Append -Force
} else {

    # If no change detected just write when the script last ran
    $Change = 'No'
    $NoChangesDetected = 'No Changes detected on: ' + $Date
    $NoChangesdetected | Out-File -FilePath 'C:\scripts\DA_NoChanges.txt' -Append -Force
}

# Credentials for the email account
# Do not store cleartext passwords in scripts
# https://powershell.org/forums/topic/powershell-specifiy-a-literal-encrypted-standard-string/
# The above link will tell you why I had to do it.

# If your Email account is on the same domain as the machine you're running the script from
# I would suggest using this function to create your encrypted Password file.
# https://gist.github.com/davefunkel/415a4a09165b8a6027a297085bf812c5
$username = 'your email here'
$password = 'password for the above email address'
$secureStringPwd = $password | ConvertTo-SecureString -AsPlainText -Force 
$creds = New-Object System.Management.Automation.PSCredential -ArgumentList $username, $secureStringPwd

# If the test above fails and the $change = "yes" then send me an email and text message
# and attach the NewAdmins.xml
If ($Change -eq 'Yes') {
    # Send the email / text notification and attach the list of current members
    $From = 'your email here'
    $To = 'your email here'
    $Cc = 'your email here'
    $Attachment = $newAdmins
    $Subject = '----Domain Admin Members has changed----'
    $Body = 'Your awesome PowerShell script has detected a change in your Domain Admin members'
    $SMTPServer = 'your smtp server address'
    $SMTPPort = '587'
    Send-MailMessage -From $From -to $To -Cc $Cc -Subject $Subject `
    -Body $Body -SmtpServer $SMTPServer -port $SMTPPort `
    -Credential $creds -Attachments $Attachment
}

These are the Action arguments for the scheduled task:
-NoLogo -NonInteractive -WindowStyle Hidden -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\AD_Audit.ps1"
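
If you’d rather create the scheduled task with PowerShell too, a hedged sketch using the ScheduledTasks module might look like this (the 6:00 AM trigger and SYSTEM principal are just examples; pick whatever schedule and account fit your environment):

$action  = New-ScheduledTaskAction -Execute 'PowerShell.exe' `
    -Argument '-NoLogo -NonInteractive -WindowStyle Hidden -NoProfile -ExecutionPolicy Bypass -File "C:\scripts\AD_Audit.ps1"'
$trigger = New-ScheduledTaskTrigger -Daily -At 6am
Register-ScheduledTask -TaskName 'Monitor Domain Admins' -Action $action -Trigger $trigger -User 'SYSTEM' -RunLevel Highest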

 

Visual Studio Code and Visual Studio Team Services Integration
http://www.signalwarrant.com/visual-studio-code-and-visual-studio-team-services-integration/
Thu, 15 Jun 2017

For whatever reason, I seem to be migrating most of my coding activity from the PowerShell ISE to Visual Studio Code. I’m not really sure why, other than the look and feel seems more pleasing than the ISE… to me anyway. Anywho, I have wanted to hack out some sort of code backup and version control solution for all of my code for a while now, and moving everything to Visual Studio Code seems like the prime time to get that done as well.

I know nothing about Git or Visual Studio Team Services; I have zero experience with either. So, in order to get this set up, I found myself searching YouTube for a tutorial… I didn’t find much. After a couple of hours of googling and YouTube video watching, I finally got it to work. To save you the time and effort, I thought it would be good to make a video myself.

Enable PowerShell Remoting to an Azure Virtual Machine, without Domain Membership
http://www.signalwarrant.com/enable-powershell-remoting-to-an-azure-virtual-machine-without-domain-membership/
Wed, 24 May 2017

I recently started an Azure subscription in order to move all the servers I use to test PowerShell code to the cloud. Right now I have only a couple of virtual machines: one running Windows Server 2016, which is my Domain Controller, and a Windows Server 2012 R2 virtual machine with Exchange 2013 installed. Obviously, both of these VMs are in the same domain.

For testing purposes, I wanted to be able to remote into the cloud VMs using PowerShell. The problem is that since my local machine is not in the same domain as the VMs, I couldn’t get authenticated. Now, you can stand up Azure Active Directory, put the local machine in that domain, and you’re good to go, but I’m trying to keep costs as low as possible, so I wasn’t willing to pay the extra expense for Azure AD. I think you can also use a certificate stored in Azure Key Vault, but again, extra expense, plus I would have to figure out how to make it work… I’m an Azure n00b.

After some quality time consulting Professor Google, I concluded that creating a certificate on each VM and then importing that certificate on my local laptop was the easiest way to make this work. Obviously, this is not a good enterprise solution, although you could probably do it a little more efficiently at scale using Certificate Services. Anywho… this is how I did it.

If you have a better method, please let me know in the comments.

 

 

# Enable Remoting to an Azure VM
Enable-PSRemoting

# Make sure to set the Public IP address to static or make sure you track the change of the public IP

# Create Network Security Group Rule to allow winrm
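# (Sketch, not from the original post: one way to open TCP 5986 from your management
#  machine with the AzureRM module. The NSG and resource group names are placeholders.)
$nsg = Get-AzureRmNetworkSecurityGroup -Name 'MyVM-nsg' -ResourceGroupName 'MyResourceGroup'
Add-AzureRmNetworkSecurityRuleConfig -NetworkSecurityGroup $nsg -Name 'Allow-WinRM-HTTPS' `
    -Access Allow -Protocol Tcp -Direction Inbound -Priority 310 `
    -SourceAddressPrefix '*' -SourcePortRange '*' `
    -DestinationAddressPrefix '*' -DestinationPortRange 5986
Set-AzureRmNetworkSecurityGroup -NetworkSecurityGroup $nsg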

# Create a Selfsigned cert on the Azure VM
$Cert = New-SelfSignedCertificate -CertstoreLocation Cert:\LocalMachine\My -DnsName PC1.mydomain.local
Export-Certificate -Cert $Cert -FilePath '<filepath>\exch.cer'

# Create a firewall rule inside the Azure VM 
New-Item -Path WSMan:\LocalHost\Listener -Transport HTTPS -Address * -CertificateThumbPrint $Cert.Thumbprint -Force
New-NetFirewallRule -DisplayName 'WinRM HTTPS-In' -Name 'WinRM HTTPS-In' -Profile Any -LocalPort 5986 -Protocol TCP

# Install the Cert on the client
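# (Sketch, not from the original post: import the exported .cer into the client's
#  Trusted Root store so the HTTPS listener is trusted. The path is a placeholder.)
Import-Certificate -FilePath '<filepath>\exch.cer' -CertStoreLocation Cert:\LocalMachine\Root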

# Run this on the remote client
$cred = Get-Credential
Enter-PSSession -ConnectionUri https://xx.xx.xx.xx:5986 -Credential $cred -SessionOption `
(New-PSSessionOption -SkipCACheck -SkipCNCheck -SkipRevocationCheck) -Authentication Negotiate

 

 
