Nintex 2010 Workflow LazyApproval and Inline Functions – fn-Substring explained

August 14, 2013 1 comment

I recently completed an update to a workflow to enable LazyApproval.  It seemed easy enough to do; there’s just a checkbox in the Flexi task.  But after I enabled the feature, I found that the workflow errored out whenever a user approved the workflow task by replying to the email.  After the approval request was approved or declined, the workflow updated four promoted fields in an InfoPath form library – three Single Line of Text fields and a Date field.  Here is the error:

The workflow could not update the item, possibly because one or more columns for the item require a different type of information.

Outcome: Unknown

This was puzzling because if the user approved the task via the Task Form, all fields were updated successfully.  I started to think the issue was with the Date field but discovered it was actually with the Approver’s Comments field.  When a user replies via email with LazyApproval, the approver’s comments are the entire email.  In my case, the contents of the email exceeded the character limit of the Single Line of Text field.  Since this InfoPath field was promoted as a SharePoint field, I wasn’t able to accommodate this by changing the column data type in the List Settings page.  Doing so ended up breaking the InfoPath form when I tried to publish an update; InfoPath indicated the field mapping had changed.  Enter the fn-Substring Inline Function.

So, having just narrowed the issue down to the number of characters being passed back to the field in the InfoPath Form Library, I thought the Inline Function fn-Substring would take care of it.  Well, not quite.  I declared a workflow variable (Single Line of Text) and set it equal to the result of fn-Substring.  Here’s what that looked like…

fn-Substring({Common:LastApproverComments}, 0, 200)

First argument: the Approver’s Comments; second argument: 0 – start at the first character; third argument: 200 – include the first 200 characters.  I figured this would be enough.
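As a plain illustration of the semantics (this is my own sketch in Python, not anything Nintex ships), the function takes a start index and a length, not an end index:

```python
def fn_substring(text, start, length):
    """Rough Python equivalent of Nintex fn-Substring:
    take `length` characters beginning at zero-based index `start`."""
    return text[start:start + length]

# Simulate a long LazyApproval email body
comments = "Approved. " + "Thanks for the detailed write-up. " * 20
truncated = fn_substring(comments, 0, 200)
print(len(truncated))  # 200 characters, safely inside the field limit
```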

The result of the initial fn-Substring was just the plain text of the email preceded by the function name: fn-Substring(body of email….  This didn’t seem right.  Why would it work with some plain text in a test workflow but not with my real data?  It turned out I had missed one crucial piece of information in the documentation on how to use this function.  Let me share it with you.

If text used for a function argument contains function syntax (i.e. a brace or comma character) wrap the argument with {TextStart} and {TextEnd} to indicate a block of text that should not be parsed further.

For example, if a number variable that contains a decimal value is passed to the function, and the decimal separator for your region is a comma, the {TextStart} and {TextEnd} tokens will need to be used.
The stars started to align at this point.  I needed to wrap my first argument, {Common:LastApproverComments}, between {TextStart} and {TextEnd} because the contents of the email included commas as well as open and close parentheses ().  I had assumed the curly braces around Common:LastApproverComments were enough to tell the parser that the entirety of that variable was in fact the first argument; not true.

The completed Inline Function now looks like this:

fn-Substring({TextStart}{Common:LastApproverComments}{TextEnd}, 0, 200)
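To see why the wrapping is needed, here is a toy Python model of the kind of naive comma splitting an argument parser might do (my own sketch; Nintex’s actual parser is surely more involved).  Without a literal marker, the commas inside the email body spawn extra arguments:

```python
def split_args(arg_string):
    """Toy model of comma-based argument splitting.
    {TextStart}...{TextEnd} marks a literal block whose contents
    are not parsed further, so its commas do not split arguments."""
    LITERAL_START, LITERAL_END = "{TextStart}", "{TextEnd}"
    args, current, in_literal = [], "", False
    i = 0
    while i < len(arg_string):
        if arg_string.startswith(LITERAL_START, i):
            in_literal = True
            i += len(LITERAL_START)
        elif arg_string.startswith(LITERAL_END, i):
            in_literal = False
            i += len(LITERAL_END)
        elif arg_string[i] == "," and not in_literal:
            args.append(current.strip())
            current = ""
            i += 1
        else:
            current += arg_string[i]
            i += 1
    args.append(current.strip())
    return args

email = "Approved, looks good (see notes), thanks"
print(split_args(email + ", 0, 200"))
# the email's own commas produce 5 arguments instead of 3
print(split_args("{TextStart}" + email + "{TextEnd}, 0, 200"))
# ['Approved, looks good (see notes), thanks', '0', '200']
```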
It’s great that Nintex provides documentation, which made troubleshooting this much easier.  It would be even better if there were a cheat sheet of all the Nintex nuances like the {TextStart} and {TextEnd} tokens.  Maybe I’ll start one and post it someday.  Hopefully this helps someone else down the road.

Leveraging the CloudShare platform

February 28, 2013 Leave a comment

I recently completed another project where I leveraged the CloudShare platform, and I have to hand it to them again.  They have been an awesome partner to work with.  I use the word partner because, even though I’m a customer, I really feel they act like a partner.  Let me share some of my experiences of working with CloudShare.

In the last year I’ve delivered three successful SharePoint projects using the CloudShare platform.  Beyond those projects, having the ability to quickly spin up early versions of SharePoint 2013, Office 2013, and Windows 8 for prototyping and demoing new platform capabilities for future projects is really great.  And as much as I love installing and configuring these products, many times I just need to quickly get a new environment set up for a new project, or check out features that take real time to configure, like the SharePoint 2010 Enterprise SP1 with SQL 2012 and BI Stack environment.  It is such a HUGE time saver to have access to these environments.  I used to lug around an external USB drive to build all of them out, and I have to say, I don’t miss having that drive plugged into my machine.

The list below highlights how I’ve been able to utilize the CloudShare environments across my various projects.

  • When starting a new project, I’m able to turn on a fully configured SharePoint development environment without the configurations or leftover solutions from the previous project.
  • The “Share a Copy” feature has been so helpful.  As mentioned in a previous post, I’ve been able to develop in one environment and share a copy with my client, who can then view and test the current progress of the project without impacting in-flight development activities, which sometimes leave a working site in a non-working state.
  • The “Share a Copy” feature also enables the Development and QA teams to work in parallel without having to deal with deployment configuration activities.  That’s not to say getting code, content, and configurations deployed successfully isn’t necessary; there are just times when getting that working can wait until later in the project.
  • When it comes time to deploy code developed in the development environment, being able to consistently spin up a new vanilla SharePoint environment really comes in handy for testing the deployment.  It lets me work out the deployment issues that typically occur on the client’s environment without having to get access to log files and the system administrators responsible for the pre-production and production environments.  No need to expose these issues to the client, right?
  • Pre-configured environments that typically take a considerable amount of time to install and configure are ready and can be turned on in minutes.

So, I wanted to share a few of the ways I’ve been able to leverage the CloudShare platform with the community at large because of how helpful it’s been for me.  Thank you, CloudShare, for enabling me to be successful in my business!!!  I also want to say thank you for all the prompt responses to support tickets.  I had a few instances where issues arose over the weekend, and the CloudShare Support Team was prompt to respond and resolved my issues quickly, even though based on the stated SLA I wasn’t expecting to hear back until Monday.  Way to go above and beyond!!!

I hope these experiences open new opportunities for anyone looking to capitalize on the service offerings from CloudShare.

Categories: CloudShare

One solution to too many cooks in the SharePoint kitchen

June 14, 2012 1 comment
Okay, it’s been a while since my last post.  I have a few in development now and will release them soon, but I wanted to share a recent experience I had on a project.
 
Recently, as I was working in my CloudShare “development” environment, I ran into a situation all developers have run into.  The client says to you, “Hey, I’d like to be able to see how it’s coming along.”  Seems all well and good, so you create a login account and they begin to poke around.  This is when it hits you: “Oh my gosh, how do I get them out of the kitchen?”  They want to watch the water boil, and open and close the oven door to check on how things are coming along.  This is when panic sets in, because you realize there is no way you’ll be able to keep moving forward while they are in there poking around and asking why this doesn’t work or that doesn’t look right.  The problem is, sometimes when something “doesn’t work” it’s not because it’s broken; it just hasn’t been implemented yet.  So you spend the next several days trying to fix things out of order, and you don’t make the progress you were on track to make.
 
So what do you do?  This is what happened to me, and I began to think about ways around it.  First I thought, “Hey, I could stand up another CloudShare server and deploy my code updates to it.”  But when building solutions on SharePoint there are more than just code updates, so the reality of setting up a true content, configuration, and code deployment environment and process seemed like more effort than it was worth at that point.  Ultimately, yes, that is the preferred method, but when you are trying to get something stood up and prototyped quickly, it is a lot of additional effort.
 
At this point, the wheels started spinning in my head.  I thought, “What if I were to take a snapshot of my current ‘dev’ environment and then restore it to a ‘new’ CloudShare environment when I am ready to share changes with the client?  Hmm….”  And as I thought about it more, it became clear this was the way to go.  So I tested it out.  It just so happened to be the same day the “Vanity URLs” product feature became available.  How perfect for me.  So here is what I did:
 
  1. Created a Snapshot of my “Development” environment
  2. Purchased another CloudShare environment
  3. Restored the Snapshot to the new environment
  4. Once the environment was restored, I noticed that the AD user account passwords had been reset.  That was okay, because I only had a few accounts in AD, so it wasn’t much work to reset them.
  5. Created 2 new Vanity URLs
    • ProjectName-Dev.cld.sr
    • ProjectName-QA.cld.sr
  6. Assigned these new URLs to each environment respectively.
  7. Updated Central Admin – Alternate Access Mappings (AAM) with the new Vanity URLs
  8. In my case, I also had to update Search to use the new URL when displaying search results.
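I made the AAM change through Central Admin, but that step can also be scripted with stsadm’s addalternatedomain operation.  Here’s a sketch; the web application URL, zone, and vanity URLs below are placeholders standing in for my setup, so adjust them for yours:

```shell
REM Map a new vanity URL onto the restored web application.
REM URLs and zone are example values, not my actual configuration.

REM Dev environment: add the vanity URL as an Internet-zone alternate access mapping
stsadm -o addalternatedomain -url http://sp2010 -incomingurl http://projectname-dev.cld.sr -urlzone Internet

REM QA environment: run the equivalent command on the restored copy
stsadm -o addalternatedomain -url http://sp2010 -incomingurl http://projectname-qa.cld.sr -urlzone Internet
```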
 
And that was it.  I now have a safe place for the client to poke around and look, without them asking why something isn’t working when it’s only “broken” because I’m actively working on it.  The best part is that it’s a repeatable process that can be done quickly, without much effort, in the background while I continue to work.
 
I must also give a SHOUT OUT to CloudShare for providing an excellent product and GREAT customer service.  They have always been quick to respond to any questions or issues I’ve had.  Way to go!!!!
Categories: Uncategorized

Performance problems with InfoPath Form People Picker resolving names from AD

November 2, 2011 Leave a comment

I encountered an issue where an InfoPath Browser Form People Picker control was taking upwards of 30 seconds to resolve a name from AD, when previously it took only 2-3 seconds.  After further investigation, a recent change to AD appeared to be causing the performance issue.  AD has approximately 13K user objects in one forest, and under the original configuration the form resolved names in 2-3 seconds.

The problem started to surface last week when the organization began preparing for a merger with an organization of similar size (roughly 13K user objects in their forest).  A two-way trust was set up between the two forests in preparation for getting email and calendars working for both organizations.  As you can imagine, these changes and their impact hadn’t been communicated to all the parties that might be affected.  Subsequently, users of the InfoPath forms brought it to our attention that the forms were taking a long time to resolve names in the people picker control.  After some searching around to diagnose the issue, we happened upon something I had used on a previous project: updating a SharePoint property using stsadm – setsiteuseraccountdirectorypath.  After setting this property, the resolve time actually improved from the original 2-3 seconds to not quite half a second.  Who knew the fix would actually beat the previous time?

Below is the script used to make this update.  Of course, you only need the stsadm line, but if you have more than one site to fix, or could see a benefit to using this in the future, the script below can be saved as a batch file.  You’ll also need to update the ‘-path’ to the OU where your users are located in AD.  And if you need to reset the property, just set ‘-path’ to “” which returns the setting to its out-of-the-box default.

@echo off
echo Path of site collection with the people picker that needs fixing:
set /P siteURL=

echo Fixing the people picker…

stsadm -o setsiteuseraccountdirectorypath -url %siteURL% -path "ou=NAME OF OU,dc=DOMAIN NAME,dc=com"

echo People picker fixed.

Hopefully this helps someone else out there.  Thankfully it only burned a couple of hours this morning, which could have turned into a more exhaustive exercise with the AD team, who were already plenty busy getting email and calendaring working for 26K users versus the 50-100 users who have to wait on a simple little people picker in a handful of InfoPath forms.

UPDATE:  11/18/11

There is another command you can use if you want to include more than one OU.

stsadm -o setproperty -pn "peoplepicker-serviceaccountdirectorypaths" -pv "ou=OU1,ou=OU2,dc=domainname,dc=Extension" -url http://server:port/sitecollection

Giving back

November 1, 2011 Leave a comment

I’ve been a consumer of many blogs over the years that have been extremely helpful in my professional and career development.  I have threatened many times to return the many favors.  Well, here I am on 11/1/11, putting my stake in the digital ground on the world wide web.  I am looking forward to using this blog to share my experiences and lessons learned with SharePoint and the ecosystem that has evolved around this amazing platform.

Categories: Uncategorized