IgorShare Thoughts and Ideas

Consulting and Training

Archive for February, 2008

SkyDrive PowerShell provider – far beyond reach

Posted by Igor Moochnick on 02/22/2008

After the MS announcement of increasing the SkyDrive size to 5Gb, I realized that it is becoming a great asset for multiple projects I’m working on. I decided to slap together some code that would give command-prompt access to this resource. A PowerShell PSDrive provider was the perfect platform for this functionality, but … It appears that (first of all) there is no SkyDrive API. After analyzing the HTML content, I realized that it’s XHTML and could be a perfect candidate for scraping with XLINQ, but (second and last) the MS SkyDrive renderer does not generate valid XHTML! Every attempt to consume it with an XML reader fails with an exception (something similar to: “unexpected token ‘=’, expecting token ‘;'”).
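The failure mode is easy to reproduce. Here is a hedged sketch (the markup below is my own illustration, not SkyDrive’s actual output) showing why a strict XML reader chokes on non-well-formed XHTML while a forgiving HTML parser gets through:

```python
# Illustrative only: a strict XML parser rejects markup that is not
# well-formed XHTML, while a lenient HTML parser copes with it.
import xml.etree.ElementTree as ET
from html.parser import HTMLParser

# Unquoted attribute and an unclosed <p> -- not well-formed XML.
bad_xhtml = '<div class=foo><p>Unquoted attribute, unclosed tag</div>'

# Strict XML parsing fails, as any XLINQ/XmlReader-style consumer would.
try:
    ET.fromstring(bad_xhtml)
    strict_ok = True
except ET.ParseError:
    strict_ok = False

# A forgiving HTML parser processes the same markup without complaint.
class TagCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.tags = []

    def handle_starttag(self, tag, attrs):
        self.tags.append(tag)

collector = TagCollector()
collector.feed(bad_xhtml)

print(strict_ok)       # False: the strict parser rejected the markup
print(collector.tags)  # ['div', 'p']
```

This is essentially what the CodePlex scraper mentioned below does: it gives up on XML correctness and falls back to plain HTML parsing.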

So, bottom line: SkyDrive is a nice toy, but very far from being a useful product.

As far as I know there are no rumors from Microsoft that a SkyDrive API is even being considered, but some people have made successful attempts to scrape data from SkyDrive (see the SkyDrive viewer on CodePlex) by using simple HTML parsing.

Posted in PowerShell, SkyDrive, Thoughts | 8 Comments »

Miguel Castro and Mark Dunn on the Marriage of WF and WCF

Posted by Igor Moochnick on 02/21/2008

Check out the new Show #103 on DnrTv:

Miguel Castro and Mark Dunn on the Marriage of WF and WCF

Posted in Tutorials, Workflows | 1 Comment »

PowerShell Rules Engine: How-To Technical Details

Posted by Igor Moochnick on 02/15/2008

In my recent article, Marrying Workflows into PowerShell (Part 1 of 2), PowerShell Rules Engine, I showed an example of how to use the Rule Set editor shipped with Workflow Foundation in your own application and how to evaluate rules against managed objects. Actually, PowerShell was just used as a specific example of how to apply this technology.

I haven’t invented anything new, and this information is readily available from different public sources, but I think it’s worth explaining how it works for the readers of this blog.

There are 3 simple steps to get this technology to work for you:

  1. Create and edit rule sets via the Rule Set Dialog and serialize the rules into an XML
  2. Load (deserialize) the Rule Set from an XML file
  3. Evaluate/apply the rules against your own objects

Step #1 is implemented in the ShowRulesDialog function:

   1: public static RuleSet ShowRulesDialog(Type targetType, RuleSet ruleSetIn, string rulesFileName)
   2: {
   3:     RuleSet ruleSet = ruleSetIn ?? new RuleSet();
   4:     RuleSetDialog ruleSetDialog = new RuleSetDialog(targetType, null, ruleSet);
   5:     DialogResult dr = ruleSetDialog.ShowDialog();
   6:     if (dr == DialogResult.OK)
   7:     {
   8:         ruleSet = ruleSetDialog.RuleSet;
   9:         if (ruleSet == null)
  10:             return null;
  12:         XmlTextWriter writer = new XmlTextWriter(rulesFileName, Encoding.Unicode);
  14:         WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
  15:         serializer.Serialize(writer, ruleSet);
  17:         writer.Flush();
  18:         writer.Close();
  19:     }
  21:     return ruleSet;
  22: }


Note line #3 – if you provide a valid Rule Set instance, the editor will allow you to edit it; if you don’t have a Rule Set, pass in a new instance and the editor will fill it in with your new rules. On line #15 the newly edited Rule Set is written (serialized) to an XML file with the WorkflowMarkupSerializer.

Step #2 is implemented in the LoadRules function:

   1: public static RuleSet LoadRules(string rulesFileName)
   2: {
   3:     if (File.Exists(rulesFileName))
   4:     {
   5:         FileStream fs = new FileStream(rulesFileName, FileMode.Open);
   6:         StreamReader sr = new StreamReader(fs);
   7:         string serializedRuleSet = sr.ReadToEnd();
   8:         WorkflowMarkupSerializer serializer = new WorkflowMarkupSerializer();
   9:         StringReader stringReader = new StringReader(serializedRuleSet);
  10:         XmlTextReader reader = new XmlTextReader(stringReader);
  11:         var ruleSet = serializer.Deserialize(reader) as RuleSet;
  12:         fs.Close();
  14:         return ruleSet;
  15:     }
  16:     return null;
  17: }


Note that this function performs the reverse operation – deserializing the Rule Set with the WorkflowMarkupSerializer – and returns a Rule Set instance (if everything went right).

The last and most important step, #3, is executed in the Run-Rules cmdlet. It evaluates the rules against each and every PSObject in the command pipeline.

   1: // Validate the rules
   2: RuleValidation validation = new RuleValidation(typeof(PSCurrentItem), null);
   3: ruleSet.Validate(validation);
   4: if (validation.Errors.Count != 0)
   5: {
   6:     foreach (ValidationError error in validation.Errors)
   7:         WriteWarning(string.Format("Validation Error: {0}", error.ErrorText));
   8: }
   9: if (validation.Errors.Count != 0)
  10:     return;
  12: ...
  14: var psObject = new PSCurrentItem() { CurrentItem = this.CurrentItem, Runtime = this };
  16: // Execute the rules against the object
  17: RuleExecution execution = new RuleExecution(validation, psObject);
  18: ruleSet.Execute(execution);
  20: if (! psObject.StopProcessing)
  21:     WriteObject(CurrentItem);

First the rule set is validated against the target type (lines #2-3), the same type that was used when the rule set was first created (line #4 in ShowRulesDialog). Then the rules are applied to the managed object (lines #17-18). The rules can modify the object, which can change the flow of the logic. On line #20 the code checks whether the StopProcessing flag was set or cleared, which determines whether the object is written back to the command pipe or omitted.

Hope this post was educational and showed how easy it is to use the Rule Set editor in your applications.

Posted in C#, PowerShell, Tutorials, Workflows | 1 Comment »

WF are not only for business (part 3): Marrying Workflows into PowerShell (Part 1 of 2), PowerShell Rules Engine

Posted by Igor Moochnick on 02/15/2008

[ Update: PowerShell Rules Engine: How-To Technical Details ] 

Workflow Foundation gives us another “gem in a rough” – Rules Evaluation Engine. It is like a product within a product. [It is “rough” in a sense as in “buried within”.]

As you know, in a lot of cases a simple if-then-else rule is either “too dumb” or it takes a lot of code (a set of rules) to express your requirement. This is where the Workflow Rules Engine comes in handy. Among other great features, it allows you to define rule priorities, status and chaining policy, which easily control the order in which the rules are evaluated (and even reevaluated). These rules can be executed against any managed object. The Rules Editor that you get with Workflow Foundation does the type reflection behind the scenes and provides you with a rich editing interface, which includes intellisense as well as syntax checking.

To make my case easier to explain, let’s take an extremely simple and exaggerated example: “let’s delete all the files that are larger than 10 bytes”.

This is what the RuleSet looks like:

[Screenshot: the Rule Set editor showing the “IsFile” and “IsBig” rules]
This is what will happen:

  • The first rule, “IsFile”, checks if the “current” item is a directory. If it is, the evaluation will “halt” for the current item and the system will move on to the next item. If it’s a file, the rules will continue to evaluate.
  • The second rule “IsBig” will execute the “Then” script only if the file size (“Length”) is more than 10 bytes.
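The two rules above can be sketched as plain code (this is only an illustration of the logic; the names and the dictionary shape are mine, not the WF rule syntax):

```python
# Illustrative sketch of the "IsFile" / "IsBig" rule pair: directories halt
# evaluation, and only files larger than 10 bytes trigger the "Then" action.
def apply_rules(item):
    """Return the outcome for one item: 'halt', 'delete', or 'keep'."""
    if item["is_dir"]:        # rule "IsFile": a directory halts evaluation
        return "halt"
    if item["length"] > 10:   # rule "IsBig": the "Then" action fires
        return "delete"
    return "keep"

print(apply_rules({"is_dir": True, "length": 0}))    # halt
print(apply_rules({"is_dir": False, "length": 42}))  # delete
print(apply_rules({"is_dir": False, "length": 5}))   # keep
```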

There are a lot of interesting things you can do with the rules, but first you have to understand how they work.

Note the order of rules execution (a crash course for dummies, since it’s a huge topic and I can’t put everything in one post):

  1. Only the active rules will be executed
  2. The rules with the higher priority will be executed earlier (0 is the lowest priority)
  3. The rules within the same priority will be executed in the ALPHABETICAL order
  4. If you have FullChaining turned on, the rules will be reevaluated until either there is nothing left to reevaluate or the “Halt” command is called
  5. For more info see the great article by Scott Allen, “Windows Workflow: Rules and Conditions”, and, of course, MSDN.
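Rules #1-3 above boil down to a simple sort. A minimal sketch (the `Rule` class and rule names are mine, purely to illustrate the ordering; chaining is omitted):

```python
# Illustrative ordering sketch: only active rules run; higher priority first;
# ties within the same priority are broken alphabetically by rule name.
from dataclasses import dataclass

@dataclass
class Rule:
    name: str
    priority: int
    active: bool = True

rules = [
    Rule("Cleanup", priority=0),
    Rule("IsBig", priority=1),
    Rule("Archive", priority=0),
    Rule("IsFile", priority=2),
    Rule("Disabled", priority=5, active=False),  # inactive: never executed
]

execution_order = sorted(
    (r for r in rules if r.active),       # rule #1: active rules only
    key=lambda r: (-r.priority, r.name),  # rules #2-3: priority, then name
)
print([r.name for r in execution_order])
# ['IsFile', 'IsBig', 'Archive', 'Cleanup']
```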

If you want to start playing with this technology, go ahead and download my code. There you’ll find 3 cmdlets. Here is a short intro:

  1. New-Rules file-name    – opens a new Rule Set editor and lets you create your own rule set.
  2. Edit-Rules file-name    – lets you edit an already existing rule set.
  3. Run-Rules file-name item-pipe    – executes the rules against each and every item in the pipeline.

This is how you can execute the Rule Set against a PowerShell command pipe:

dir c:\logs | Run-Rules rules.xml | % { "copy $_ c:\temp" }

All the items, after the rules were applied, will continue traveling down the command pipe. If you’d like to stop an item from going through, add the action:

this.StopProcessing = true

The code is operational as it is, so, what are you waiting for? Go ahead and unleash your imagination!

But first, don’t forget to take the code from the usual place on my web site.

Note: the code is provided only as a sample and a good starting point for your projects. If you decide to make any changes and share them with the community, send the code to me and I’ll upload it.

For more info:

MSDN article: Introduction to the Windows Workflow Foundation Rules Engine

Talk by Michael Stiefel: Using the WF Rules Engine Outside of a Workflow (and the attached presentation with a code sample)

Posted in C#, PowerShell, Tutorials, Workflows | 2 Comments »

Windows Workflow Foundation Tutorial Series

Posted by Igor Moochnick on 02/14/2008

Scott Hanselman just pointed my attention to a great resource by Joe Stagner, “Windows Workflow Foundation Tutorial Series”, when I mentioned that “I don’t see a lot of news around the Workflows – it looks like their adoption is slowing down”.  Thank you, Scott!

It’s a great start, but I still hope that the adoption of the Workflows technology will start picking up.


Joe Stagner points our attention to the following WebCast series:

Intro to Windows Workflow Foundation (Part 1 of 7): Workflow in Windows Applications (Level 100)

Intro to Windows Workflow Foundation (Part 2 of 7): Simple Human Workflow Using E-mail (Level 200)

Intro to Windows Workflow Foundation (Part 3 of 7): Hosting and Communications Options in Workflow Scenarios (Level 300)

Intro to Windows Workflow Foundation (Part 4 of 7): Workflow, Messaging, and Services: Developing Distributed Applications with Workflows (Level 300)

Intro to Windows Workflow Foundation (Part 5 of 7): Developing Event Driven State Machine Workflows (Level 300)

Intro to Windows Workflow Foundation (Part 6 of 7): Extending Workflow Capabilities with Custom Activities (Level 300)

Intro to Windows Workflow Foundation (Part 7 of 7): Developing Rules Driven Workflows (Level 300)

Posted in Thoughts, Tutorials, Workflows | Leave a Comment »

WF are not only for business (part 2): Marrying PowerShell into Workflows (Part 1 of 4)

Posted by Igor Moochnick on 02/13/2008

See other posts in the series “Workflows are not only for business”:
      Part 1: “Workflows are not only for business logic, or how to define an SMTP protocol state machine in a Workflow.”

Nowadays the complexity of management systems and management-related processes is reaching new heights. This is a theme for a separate discussion and not the scope of this post but, as I like to say, it can be attributed to a simple fact: “because we can do so!” 🙂 What I mean here is that the processing power of current systems and the flexibility of the tools allow us this.

All this means that there is a never-ending race to create better and simpler tools for administrators and system analysts to define and seamlessly implement complex processes, such as:

  • Provisioning Systems
  • Operations Management
  • System Control
  • Automation Processes
  • etc …

To make my point, let’s discuss a “down to earth” example. Let’s say we have a server farm that stores and manages customers’ media content – something like YouTube. Imagine how many tasks are involved in making such a system tick: content conversion, distribution, expiration, etc… Note that I haven’t even mentioned the usual suspects like backups, account management, etc… These tasks are constantly changing and evolving with the system, which means they should be adjusted accordingly in a timely manner. As you know, it’s pretty mundane to write and update the code as well as the shell scripts. As I see it, workflows are the perfect candidate to take away (or, at least, relax) this management overhead.

Hmm… Let’s see what we have lying around that can help us in this quest. We have an amazing tool for analysts to define processes – Workflows – and another no less amazing and extremely flexible tool, fine-tuned for solving most IT-related issues – PowerShell. A match made in heaven 😉

Thinking about how many different scenarios can benefit from PowerShell’s flexibility makes me ask a question: “Why not make PowerShell a first-class citizen in the Workflows?”

[The picture on the right is a teaser and shows a possible (exaggerated) scenario. I’ve put in the Xceed controls as a thank-you to Xceed for thinking about the community and providing a community version of their controls for public use.]

Continue reading for more information about using and implementing the PowerShell activities …


Read the rest of this entry »

Posted in C#, PowerShell, Thoughts, Tutorials, Workflows | 6 Comments »

My speaking gigs on the next Code Camp #9 (April 5-6, Waltham, MA)

Posted by Igor Moochnick on 02/12/2008

This is the list of the presentations I’ll be giving during the upcoming Code Camp #9 (April 5-6, Waltham, MA):

  • Workflows are not only for serious dudes (for business)
    How the Workflows can be used outside of its obvious boundaries in the places you didn’t even think possible.
  • Creating Database agnostic applications
    Good practices, tips and tricks for creating applications that are targeted to work with multiple types of DBs.
  • Enterprise Service Bus, huh?
    Breaking the boundaries of the enterprise – advanced communications in BizTalk Services domain
  • Give it a REST
    WCF, REST, POX, URLP and other animals

Let me know if there are any other topics you want me to cover or what issues you’d like me to address.

Posted in Presentations, Tutorials | Leave a Comment »

INETA Speakers Bureau to capture Speakers Presentations

Posted by Igor Moochnick on 02/12/2008

Lately the INETA Speakers Bureau has been looking into different ways to start capturing presentations in Podcast or VideoCast form and making them available to the broader public.

(As a member of this bureau) I’m looking for volunteers who can give me a hand during the upcoming Code Camp #9 (April 5-6, Waltham, MA): people who can operate a camera or do video editing. I’m also looking for people who can volunteer their equipment.

Please contact me any time: igor_moochnick_at_yahoo_dot_com.


Posted in INETA | Leave a Comment »

Exposing dynamic data via RSS feed

Posted by Igor Moochnick on 02/11/2008

In my recent article “RSS is not only for the news” I mentioned that the current Syndication API is pretty flexible and allows a great deal of control over what is sent over the wire. This allows you to give the client almost anything it asks for. For example, your application can provide graphical information, binary data, etc… Think about all the different ways you can present your data.

Here is a simple example: let’s say you have a resource that is changing over time (it could be traffic data). You can provide a link to this data, and as soon as a user (or another service) navigates there, the data (for example, XML) can be streamed as a response. But what if it’s a real user and you want to be friendly and present this data in a graphical form (render a graph)? You can do that by just adding an additional parameter to the URL that tells your service what format the data should be presented in.
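A hedged sketch of the idea (the parameter name, function and formats below are my own illustration, not part of the Syndication API): the same data gets a different representation depending on a format hint.

```python
# Illustrative only: choose a response representation from a format hint,
# the way a URL parameter could steer what a feed operation streams back.
def render(data, fmt):
    """Return (content_type, body) for the requested representation."""
    if fmt == "xml":
        # Machine-friendly form: raw values as XML.
        items = "".join(f"<point>{v}</point>" for v in data)
        return ("text/xml", f"<traffic>{items}</traffic>")
    if fmt == "text":
        # Human-friendly form: a crude bar "graph", one bar per sample.
        bars = "\n".join("#" * v for v in data)
        return ("text/plain", bars)
    raise ValueError(f"unknown format: {fmt}")

content_type, body = render([3, 1, 2], "xml")
print(content_type)  # text/xml
print(body)          # <traffic><point>3</point><point>1</point><point>2</point></traffic>
```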

Remember the feed contract?

public interface IFileChangeFeed
{
    // ... other operations

    [WebGet(UriTemplate = "file?path={path}")]
    Stream GetFile(string path);
}

Note that the GetFile request has a URL parameter. In my case the parameter tells the service which file to retrieve, but in your case it could tell your service what data to generate and in what format. [Note: please don’t deploy this example on real systems until you know EXACTLY how and who is accessing your feed. The sample, deployed AS-IS, exposes your system to external attacks.]

The URL request that will be routed to the GetFile method can look like this: http://yourHost/service/location/file?path=fileName.ext

WCF will find a match in your contract and call the GetFile method, substituting the “path” variable with the value of the “path” parameter.
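The matching step can be sketched in a few lines (this is an illustration of what UriTemplate dispatch does conceptually, not WCF’s actual matcher):

```python
# Illustrative sketch: match a request URL against "file?path={path}" and
# extract the value that would be bound to the method's "path" parameter.
from urllib.parse import urlparse, parse_qs

def match_get_file(url):
    """Return the 'path' value if the URL matches file?path={path}, else None."""
    parsed = urlparse(url)
    if not parsed.path.endswith("/file"):
        return None
    values = parse_qs(parsed.query).get("path")
    return values[0] if values else None

print(match_get_file("http://yourHost/service/location/file?path=fileName.ext"))
# fileName.ext
```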

public Stream GetFile(string path)
{
    if (File.Exists(path))
    {
        WebOperationContext.Current.OutgoingResponse.ContentType = "text/plain";
        WebOperationContext.Current.OutgoingResponse.Headers.Add("Content-Length",
             new FileInfo(path).Length.ToString());
        return new FileStream(path, FileMode.Open, FileAccess.Read);
    }
    return null;
}


This is a pretty straightforward way to return any type of data, depending on the incoming request. As you can see it’s extremely simple, so unleash your imagination and let REST be your friend!

BTW: I just found an MSDN article, “HTTP Programming with WCF and the .NET Framework 3.5”, that talks about similar issues and guides you through other REST-related examples.

Posted in C#, REST, Tutorials, WCF | Leave a Comment »

Great PowerShell resource to watch

Posted by Igor Moochnick on 02/09/2008

Check out Keith Hill’s blog for the Effective PowerShell series:

Effective PowerShell Item 1: The Four Cmdlets That are the Keys to Finding Your Way Around PowerShell

Effective PowerShell Item 2: Use the Objects Luke. Use the Objects!

Effective PowerShell Item 3: Know Your Output Formatters

Effective PowerShell Item 4: Commenting Out Lines in a Script File

Effective PowerShell Item 5: Use Set-PSDebug -Strict In Your Scripts – Religiously

Effective PowerShell Item 6: Know What Objects Are Flowing Down the Pipe

Effective PowerShell Item 7: Understanding “Output”

Effective PowerShell Item 8: Output Cardinality – Scalars, Collections and Empty Sets – Oh My!

Effective PowerShell Item 9: Regular Expressions – One of the Power Tools in PowerShell

Effective PowerShell Item 10: Understanding PowerShell Parsing Modes

Posted in PowerShell, Tutorials | 2 Comments »
