All posts by Alberto Ortega

Sharepoint 2010 – Change SAML Token Lifetime

Yesterday I went through an interesting analysis with Matias about the best way to tweak the SAML token lifetime for SharePoint 2010 web applications that use ADFS as a claims authentication provider.

We basically have three cookies to worry about in this scenario: the authentication cookie, the account partner cookie, and the SharePoint cookie. The account partner cookie is the one that bypasses the home realm discovery page when we hit ADFS, and it is not involved in this scenario.

The authentication cookie has two associated lifetimes: the SSOLifetime and the TokenLifetime for a specific relying party. You can change the TokenLifetime using the following PowerShell script.

Add-PSSnapin Microsoft.ADFS.Powershell

Set-ADFSRelyingPartyTrust -TargetName "Relying Party Common Name" -TokenLifetime 15
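
If you want to check the current value before touching it, the matching Get cmdlet should show it (same relying party name placeholder as above):

#Show the current token lifetime in minutes (0 means the ADFS default applies)
Get-ADFSRelyingPartyTrust -Name "Relying Party Common Name" | Select-Object Name, TokenLifetime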

On the SharePoint side there is also a setting to consider: the LogonTokenCacheExpirationWindow. That value is the time window, before the SAML token expires, during which SharePoint considers the token stale and goes back to renew it. The LogonTokenCacheExpirationWindow must always be well below the TokenLifetime; for example, with the TokenLifetime of 15 minutes set above and the default window of 10 minutes, SharePoint treats a freshly issued token as good for only 5 minutes. If both values are similar you basically bounce back and forth until ADFS stops you with the error message "The same client browser session has made '6' requests in the last '12' seconds.". This happens because as soon as SharePoint receives the SAML token from ADFS it sees that the cookie is good for less time than the LogonTokenCacheExpirationWindow, so it goes straight back to ADFS to authenticate again.

We tried different values for this setting and we think 1 second is enough for that window. Making the window as small as possible pushes session lifetime management over to ADFS.

$sts = Get-SPSecurityTokenServiceConfig

$sts.LogonTokenCacheExpirationWindow = (New-TimeSpan -Minutes 1)

$sts.Update()

iisreset
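
After the iisreset you can confirm the new window took effect; with the value used above it should report 00:01:00:

(Get-SPSecurityTokenServiceConfig).LogonTokenCacheExpirationWindow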

Reference: http://blogs.technet.com/b/speschka/archive/2010/08/09/setting-the-login-token-expiration-correctly-for-sharepoint-2010-saml-claims-users.aspx. We found this post very useful; it gives you a deeper look at the problem.

Sharepoint 2010 and ADFS – Sign in as a different user

It is up to the application to do a proper federated sign-out, and SharePoint 2010 out of the box does not do this correctly. If you look at the HTTP conversation with Fiddler, SharePoint does not call the wa=wsignout1.0 action on ADFS; it simply clears the current authentication cookie.

How do you configure SharePoint to call the wa=wsignout1.0 action?

  1. You need to modify the Welcome.ascx page under C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\CONTROLTEMPLATES on every front-end server that hosts this web application.
  2. Find the section
    id="ID_Logout2"
  3. Modify that section so that the element with id="ID_Logout2" gets a ClientOnClickNavigateUrl attribute pointing to your ADFS sign-out URL (see the sketch after this list).
  4. Now when you click the Sign Out button in SharePoint 2010 you will be redirected to ADFS.
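
As a rough sketch, the modified entry would look something like the line below. Only ClientOnClickNavigateUrl is new: keep whatever other attributes your Welcome.ascx already has on that element, and replace the ADFS host, which is a placeholder, with your own sign-out endpoint.

<SharePoint:MenuItemTemplate runat="server" ID="ID_Logout2"
    ClientOnClickNavigateUrl="https://adfs.contoso.com/adfs/ls/?wa=wsignout1.0" />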

Identified Caveats

  1. The Kerberos ticket will not expire: although you logged out from ADFS, you will not be able to log in as a different domain user unless you close and re-open your browser to drop the Kerberos ticket.
  2. Sign-out will not redirect back to SharePoint: according to the WS-Federation protocol specification (http://msdn.microsoft.com/en-us/library/bb608217.aspx), appending &wreply=encoded_URL to the query string should redirect back to the current SharePoint page (see the example after this list). However, in my quick tests I was not able to make it work, so we need to dig deeper into this.
  3. Impact of the change: replacing the entire Welcome.ascx file will need to be included as a post-configuration step of the SharePoint deployment.
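
For reference, per the spec the sign-out request with a return URL would look something like this (both host names are placeholders):

https://adfs.contoso.com/adfs/ls/?wa=wsignout1.0&wreply=https%3A%2F%2Fsp2010.contoso.com%2F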

Reference: http://www.shailen.sukul.org/2010/05/adfs-2-sharepoint-2010-signout.html. Everything is figured out there; however, I still need to go deeper to tackle the identified caveats.

Now you really can have your head in the clouds [Material]

Yesterday we had a tremendous technical conference with a lot of people who came to discuss the cloud. Some material came out of the event, and I am adding it to this post for anyone who wants a starting point for building a private cloud with SCVMM.

You can download the conference slide deck here to start getting some context: http://cid-5f9c7b75bd402dda.skydrive.live.com/self.aspx/Pub/Conf%5E_Cloud%5E_20100518/Conferencia%20Tecnica%20-%20Cloud.pdf

As a (very high-level) recap, this is what we did in the private cloud part of the demo:

  1. Pre-Setup: SCVMM installed and running, integrated with SCOM and the Self-Service Portal.
  2. Pre-Setup: Create a generalized VHD via sysprep.exe and copy it to the SCVMM library.
  3. IT Pro: Create a Virtual Machine Template
    1. Start from the generalized VHD.
    2. In the hardware profile, guarantee connectivity.
    3. In the software profile, join the domain, use an unattended answer file, and run as GUIRunOnce a script that grabs the IP of the VM that just came up and copies it to a file share, where a service running on TMG picks it up and creates a rule on the perimeter so that RDP is reachable from the WAN. The answer file we used was:
      <?xml version="1.0" encoding="utf-8"?>
      <unattend xmlns="urn:schemas-microsoft-com:unattend">
        <settings pass="oobeSystem">
          <component name="Microsoft-Windows-Shell-Setup" processorArchitecture="amd64" publicKeyToken="31bf3856ad364e35" language="neutral" versionScope="nonSxS" xmlns:wcm="http://schemas.microsoft.com/WMIConfig/2002/State">
            <Display>
              <ColorDepth>16</ColorDepth>
              <HorizontalResolution>1024</HorizontalResolution>
              <RefreshRate>60</RefreshRate>
              <VerticalResolution>768</VerticalResolution>
            </Display>
            <RegisteredOrganization>TechNET</RegisteredOrganization>
            <OOBE>
              <HideEULAPage>true</HideEULAPage>
              <NetworkLocation>Work</NetworkLocation>
              <ProtectYourPC>1</ProtectYourPC>
              <SkipMachineOOBE>true</SkipMachineOOBE>
              <SkipUserOOBE>true</SkipUserOOBE>
            </OOBE>
            <AutoLogon>
              <Password>
                <Value>Pas*********</Value>
              </Password>
              <Domain>TECHNET</Domain>
              <Enabled>true</Enabled>
              <LogonCount>1</LogonCount>
              <Username>Administrator</Username>
            </AutoLogon>
          </component>
        </settings>
      </unattend>

The script we used to grab the IP, executed through GUIRunOnce, is the following (yes, there is a lot of room for improvement):
' Grab the VM's IP address and drop it on a file share for the TMG-side service to pick up
' NOTE: the exact UNC path is an assumption (the original lost its separators); adjust \\TN-NOD2\VHDsrc\tmp to your environment
Dim objShell
Set objShell = WScript.CreateObject("WScript.Shell")

' Dump the ipconfig output to the share
iReturn = objShell.Run("CMD /C ipconfig > \\TN-NOD2\VHDsrc\tmp", , True)

' Pull the 192.168.200.x address out of the dump and write it to ip.txt
iReturn = objShell.Run("powershell ""&{$match = Select-String -Path \\TN-NOD2\VHDsrc\tmp -Pattern 192.168.200.? | Out-String; $match.substring(64,15)}"" > ip.txt", , True)

For the script that creates the TMG rule from the previous script's output, ask its author over at http://blogs.prisma.cc/blogs/leandro/

  4. IT Pro: Edit the Virtual Machine Template you just created and set its Cost Center.
  5. IT Pro: Create a Self-Service Users group, add the domain users who will be allowed to provision that template, and add the template to the group so it shows up in the portal when they sign in. Configure the quota based on the hardware you have.
  6. End User: Access the Self-Service Portal and provision a new virtual machine.
  7. IT Pro: Run the "Virtual Machine Allocation" report from SCOM to see how much hardware was allocated to each cost center (a quick PowerShell cross-check is sketched after this list).
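
As a quick cross-check of that allocation report, you can group the VMs by their Cost Center custom property with the same VMM snap-in; a minimal sketch, using the server name used elsewhere in this post:

add-pssnapin Microsoft.SystemCenter.VirtualMachineManager
get-vmmserver TN-VMM1 | Out-Null
#Group VMs by the Cost Center custom property and count them
Get-VM -VMMServer TN-VMM1 | Group-Object { $_.CustomProperties[0] } | Select-Object Name, Count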

Finally, these are the references we added at the end of the slide deck to keep digging deeper:

[SCVMM Powershell] Update the Cost Center property of every managed Virtual Machine

I wanted to start using the VMM allocation reports properly, so I needed to update the "Cost Center" attribute of every virtual machine. We currently have more than 120 VMs in our lab, so I decided to write a PowerShell script to automate the task. "Cost Center" is a so-called custom property of the virtual machine object that you can access through the CustomProperties array, as in $_.CustomProperties[0].

The script is simple and goes straight to the point:

#Connect to SCVMM remotely.

add-pssnapin Microsoft.SystemCenter.VirtualMachineManager

$scvmm = get-vmmserver TN-VMM1

#Get every VM

$vms = Get-VM -VMMServer TN-VMM1

#Set the cost center

$vms | ForEach-Object {$_.CustomProperties[0]='Produccion'}

#Show a couple of VMs' custom properties.

$vms | select Name, Status, @{Name='CostCenter';Expression={$_.CustomProperties[0]}}, @{Name='LastUpdated';Expression={$_.CustomProperties[1]}}
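
Since the select above also surfaces a LastUpdated column from CustomProperties[1], one extra line can stamp when the cost center was last touched (this assumes that slot is not already used for something else):

#Record when the cost center was last updated
$vms | ForEach-Object {$_.CustomProperties[1] = (Get-Date).ToString("yyyy-MM-dd HH:mm")}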

Now you really can have your head in the clouds (Technical Conference)

Next Tuesday, May 18, at 6:30 PM at Microsoft Argentina (Bouchard 710, 4th floor), together with the IT-Monster Leandro Amore we will dig into the state of the art of the cloud, with a deep dive on private clouds.

Register here! https://msevents.microsoft.com/CUI/EventDetail.aspx?EventID=1032450458&culture=es-AR

We have planned a ~2-hour conference covering the following topics:

  1. Intro: the state of the art of the cloud and where we stand: PaaS, IaaS and SaaS.
  2. Private cloud demo: cloud-ready hardware and a great scenario tuning System Center Virtual Machine Manager on Hyper-V to act as a proper private cloud.
  3. Public cloud demo: a walk through the current Microsoft Online Services offering (Forefront, SharePoint, Communicator) and Windows Azure.

This is what we have for now; it may still change, as we have one more week to keep preparing the environments.

See you there!

Sharepoint 2010 Inbound and Outbound Email Configuration (AD Integrated) without Exchange

An Exchange deployment is not necessary to take advantage of SharePoint's email capabilities. Turning an email sent to a given address into an item on a list is a powerful feature that boosts your information workers' collaboration capabilities.

Incoming and outgoing email configuration is finally working on my SharePoint server. You can now email-enable a list by going to "List Settings" > Communications; there you can map any address following the pattern mylist@mydomain.com to your SharePoint list.

Configuration Steps

  1. Configure an IIS SMTP Server on the Sharepoint Server as an open relay server.
    1. For Windows Server 2008 R2, add the SMTP feature.

    2. Open your IIS 6.0 manager (make sure you installed the service as part of the Web Server Role).
    3. Configure the default SMTP Virtual Server #1 as desired, nothing special here.
    4. Add your domain as an alias domain and this is it.
  2. Point MX records for the zone mydomain.com to the SMTP server.
  3. Create an OU on AD dedicated to Sharepoint contacts and delegate permissions to the Sharepoint computer account on the OU. We used the NETWORK_SERVICE account because the Central Admin Website is running under the Network Service account.
    1. Read the sections “To delegate Create all Child Objects and Delete all Child Objects control of the OU to the application pool identity account for Central Administration” and “To add Delete Subtree permissions for the application pool identity account for Central Administration” on http://technet.microsoft.com/en-us/library/cc287879(office.14).aspx

  4. (Weird :S) Extend the AD schema with Exchange 2003 forestprep. SharePoint can work without Exchange; however, for the AD integration the schema has to be prepared.
  5. Configure the Inbound and Outbound Email settings in the "System Settings" section of SharePoint Central Administration (the outbound part can also be scripted; see the sketch after this list).
  6. Create a discussion list, then configure its incoming email settings.
  7. Send an email to test@mydomain.com and watch the item get created.
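
The outbound part of step 5 can also be set from PowerShell; a minimal sketch, assuming a web application at http://sp2010 and placeholder server/address values:

$webApp = Get-SPWebApplication "http://sp2010"
#UpdateMailSettings(SMTP server, from address, reply-to address, code page)
$webApp.UpdateMailSettings("sp2010.mydomain.com", "sharepoint@mydomain.com", "sharepoint@mydomain.com", 65001)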

Resources

- MX Test: http://www.mxtoolbox.com/SuperTool.aspx?action=mx:swfwd.com
- Sharepoint 2010: http://technet.microsoft.com/en-us/library/cc287879(office.14).aspx


Virtual Launch "La eficiencia en tus manos" ("Efficiency in your hands")

Hi everyone, I want to remind you that this April 21 is the virtual launch event for 11 important products, where many MVPs will be speaking and acting as experts to answer your questions.



[Review] GFI Max Mail Protection

Summary

Cloud computing, with Infrastructure and Platform as a Service, is becoming a key business enabler for the CIO of any mid-sized company. Why are hosted services a key enabler for small IT departments? Easy: the inherent complexity of a growing business makes it very hard to keep a consistently great level of service as the months pass by.

Hosted solutions work like a charm when they are well implemented, which is why I took a look at GFI MAX MailProtection; as a consultant I need to know how these services are performing. I was very impressed by GFI MAX MailProtection. I hope you enjoy this review, and if you are looking for a hosted solution for your email sanitization you should definitely take a look at this great service from the people at GFI.

Review

As a technical leader and consultant I am in constant touch with my customers and need to stay current with management and security solutions. So I gave GFI's hosted mail security solution a try and found a very professional service; keep on reading to find out more!

Getting a Free Trial

Your journey begins with a free 30-day trial. The process was simpler than I thought: not much information is gathered in the two-step form and you are not "invited" to enter your credit card information, which means that if you are simply curious, as I am, it is easy to get going with the service.

The terms and conditions proposed by GFI are fair enough compared with what other vendors offer. In the SERVICE LEVEL COMMITMENT AND EXCLUSIVE REMEDY section they state: "if GFI determines that the GFI Systems were unavailable for twenty or more consecutive minutes during a calendar month, GFI, upon Customer's request within five days of such event, will credit Customer's account the pro-rated charges for one (1) day of the contracted GFI Service(s)". You may think twenty or more minutes is worth more than one day of contracted service, but this is comparable to the Google Apps Premier conditions or Amazon Web Services, to name a few mission-critical hosted services. I am used to this kind of agreement, and I have learned that you need to use the service to see how it works when you need to move a mid-sized business forward and do not have a legal department to back you up in a possible legal dispute.

Then, the CONFIDENTIAL INFORMATION section states "IT IS UNDERSTOOD THAT IN ALL CASES, CLIENT'S EMAIL SHALL BE DEEMED AS CONFIDENTIAL INFORMATION"; however, they NEED to access the information to guarantee sanitization, and that's a fact: you cannot cure what you cannot touch. From my perspective there is a good understanding of intellectual property as an asset, and they take care to make it explicit in this agreement.

After a few seconds my free trial was provisioned and I got my login information in my inbox. Sweet!

Initial Configuration

Well, it took me 5 minutes to go through the basic initial 4-step configuration:

  1. Set up a DOMAIN in the control panel.
  2. Set up the USERS (email addresses) for the domain.
  3. Configure INBOUND FILTERING.
  4. Change the DNS “MX” records for your domain

Setting up my domain, I found a great multi-tenancy approach that eases the administration of multiple companies. Every action you take is in the context of an organization.

Creating users is easy; your end-users will need to access the GFI console to see, for example, how their SPAM is being handled.

Configuring inbound filtering has a handy slider-like control for how aggressive the filtering is; the default configuration is highly aggressive, which sounds good to me.

If you like grey-listing (which is not my case) you have the chance to train the tool to challenge unknown senders' email servers, although you must be willing to accept delays in your email delivery, which may not be acceptable for your CxO end-users.

Finally, I changed my MX records to point to the GFI servers. It took a couple of hours for my email to begin flowing through GFI MAX MailProtection.

I had nothing left to do, so in the meantime I took a couple of minutes to look at the reporting capabilities, and I found a neat report that enables serious threat analysis to support most security process needs. Summarized by day, you start getting numbers and learning your users' taxonomy, for example which sector has the worst received/infected ratio, so you can take preventive actions.

Conclusion

I have used other hosted email sanitization services, like Postini, extensively. GFI did a great job on ease of use, making the life of the IT professional a bit easier and letting them manage more volume more effectively. They also provide a great experience for the end user, making it more tangible why email sanitization is necessary by showing how many SPAM messages were filtered. Every security professional agrees on this: "End user training is key for the security of the enterprise". End users need to be aware, and services like this, which are hard to maintain in-house in a mid-sized business with an IT area of 4 people, help move towards that vision.

Sharepoint 2010 – How to deploy custom Webparts

I have found many posts on the web about how easy it is to develop and debug a custom Visual Web Part; here are a couple of great references.

The problem is that there is not much talk about how to deploy the packaged web part on a different server. I ran into this issue following the last post mentioned above, because I am trying to deploy a custom web part to analyze the claims inside a token on a claims-based authentication SharePoint web application.

I finally came across this post (http://dotnet.sys-con.com/node/1208275), which details the deployment process as follows and is recommended reading BTW. I am assuming you have a .WSP web part package built by Visual Studio 2010 Beta.

  1. Add-SPSolution c:\code\SharePointProject2\bin\debug\SharePointProject2.wsp
  2. Install-SPSolution -Identity SharePointProject2.wsp -WebApplication http://sp2010 -GACDeployment
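
Install-SPSolution only schedules a deployment timer job, so it is worth confirming that the solution actually reports as deployed before moving on; a quick check, using the same solution name:

#Returns True once the deployment timer job has finished
(Get-SPSolution SharePointProject2.wsp).Deployed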

I will complement that post by adding the final steps you need to run to actually see the web part in your web application!

When adding a web part while editing a SharePoint page you might find that not all available web parts are shown. This can happen if your site collection does not have all features enabled, and the same activation is needed for custom web parts. Do the following as a site collection administrator (or script it, as sketched after the list below):

  1. Site Actions
  2. Site Settings
  3. Manage Site Features
  4. Site Collection Features
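
The activation can also be scripted; a minimal sketch, where the feature name is a placeholder for whatever your .wsp defines and the web application is the one used earlier:

#Find the feature that shipped with the solution, then activate it
Get-SPFeature | Where-Object { $_.DisplayName -like "SharePointProject2*" }
Enable-SPFeature -Identity "SharePointProject2_Feature1" -Url http://sp2010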

Hope this helps!

Sharepoint 2010- Logging is out of control!

Well, I spent a couple of days working with the SharePoint 2010 Beta and have a couple of IMPORTANT operational tips to keep in mind when managing the Beta, at least. I have an on-premise deployment (pre-production) and a cloud deployment on Amazon EC2 (production), and in both places I ran into issues caused by unbounded log file growth.

We expect to see this monitored on the SCOM Management Pack when the product hits RTM.

Lessons Learned

  1. Watch out for the log files on the file system: by default they go to "C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\LOGS". SharePoint generated 147 GB of log files in two weeks! Yes, 147 GB; hopefully this will be fixed by the RTM release. In the meantime you should configure Central Administration > Diagnostic Logging to limit the space available for log files (or script it; see the sketch after this list). I configured 1 GB for our cloud deployment.
  2. Watch out for the WSS_Logging database: when you do a single-server installation using SQL Express 2008 to host the SharePoint DBs, you must watch the growth of the DB named "WSS_Logging", which usually resides in "C:\Program Files\Microsoft Office Servers\14.0\Data\MSSQL10.SHAREPOINT\MSSQL\DATA". I had one instance that reached 4 GB and pushed my SQL Express instance to its limit (here is someone who ran into the same issue). That DB cannot be truncated, as that is not supported right now, and it holds the web traffic analytics. You can disable health and usage data collection from Monitoring > Reporting > Configure usage and health data collection; however, web analytics is too useful to simply leave out.
    I was not able to find a procedure to migrate that DB to another server, so I had to disable the feature.
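
Both knobs above can also be set from PowerShell; a minimal sketch (the 1 GB cap matches what I configured for the cloud deployment, adjust to taste):

#Cap the ULS log folder at 1 GB
Set-SPDiagnosticConfig -LogMaxDiskSpaceUsageEnabled -LogDiskSpaceUsageGB 1
#Stop the usage data collection that feeds WSS_Logging
Set-SPUsageService -LoggingEnabled $false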

Key Takeaways

  1. Limit your log files usage.
  2. Deploy on server-farm mode to move the WSS_Logging db to another db server.