MSDN VB Dev Center Related Posts by ThinqLinq

Custom Logging with Entity Framework EF6

Coming from LINQ to SQL, I’ve long been a fan of its logging simplicity: you just set the context’s Log property to a TextWriter (such as Console.Out) to see the stream of SQL statements being issued to the database. Prior to EF6, this was a rather frustrating omission in Entity Framework that has finally been rectified. Now, you can log SQL statements just as simply, but with a slightly different implementation. In EF, you set the DbContext’s Database.Log property to an Action delegate that takes a string as its input parameter. Thus, to log requests to the diagnostics Trace listeners, you simply set the following:

context.Database.Log = Sub(val) Diagnostics.Trace.WriteLine(val)

This is fine as long as you are willing to accept the default logging implementation. If you want to customize the log output, things get a bit trickier. In my example, I only want to log the calling method and the elapsed time that the query took to execute. I’m not as concerned with the SQL string for my current need, but it is easily included, as I’ll point out below.

To start, I need to capture the calling method from my code so that I can access it in the logging implementation. Since the calling code is actually in ASP.NET Web API methods returning IQueryables, the query is not executed against the database until long after my application code has completed. I therefore need to explicitly identify the application’s calling method rather than taking the last method from the call stack, which would otherwise be one of the Web API internal methods. To handle this and centralize my set-up logic for the context, I’ll create a factory method which configures the context and use that instead of relying on the context’s constructor. Thanks to partial classes, I can extend the generated entity class to include this new property and factory method. To make matters even easier, I’ll take advantage of the CallerMemberName attribute to automatically pull the member name from the method that is calling the factory.

Public Class NorthwindEfEntities

    Public Shared Function ContextFactory(<CallerMemberName> Optional memberName As String = "") As NorthwindEfEntities
        Dim context = New NorthwindEfEntities()
        context.CallingMethod = memberName
        context.Database.Log = Sub(val) Diagnostics.Trace.WriteLine(val)
        Return context
    End Function

    Public Property CallingMethod As String

End Class

Now, to create the context we call this new ContextFactory method, but don’t pass the memberName explicitly. The compiler will add that for us automatically.

    Public Function GetCustomers() As IQueryable(Of DTO.DtoCustomer)
        Dim cn = NorthwindEfEntities.ContextFactory()
        ' Do some amazing query and return it.
    End Function

Now that we’ve set up the logger, we need to customize the output that is generated. To do this, we add a new class that derives from DatabaseLogFormatter (in the System.Data.Entity.Infrastructure.Interception namespace). If you don’t have this namespace, you may need to upgrade to EF6 in order to access the logging functionality. Since the base class doesn’t have a default parameterless constructor, we need to create one and simply delegate to the base implementation. With that out of the way, we can supply our own logging. The base implementation gives us hooks for the following interception points:

LogCommand - Writes the SQL statement to the action implementation prior to executing the statement
LogResult - Writes when the SQL statement has completed
LogParameter - Writes the parameter(s) used in the query
Executing/Executed - Called before and after the database request is made
NonQueryExecuting/NonQueryExecuted - Called for queries that don’t return results (insert/update/delete and stored procedures without result sets)
ReaderExecuting/ReaderExecuted - Called for queries that return tabular data results (select)
ScalarExecuting/ScalarExecuted - Called for queries that return single-value results (user-defined functions)
Write - Base write implementation that each of the loggers uses to format the output

As stated before, in our example we want to write a log entry when the command completes, including the calling method name and execution time. To do this, we make the following adjustments to the base implementation: 1) suppress logging of the SQL statements in the LogCommand method by simply creating a no-op and not delegating to the base implementation, and 2) replace the default result log information with our custom output in the LogResult method. To get the elapsed time, we directly leverage the Stopwatch property of the base class. Here then is our new custom formatter:

Imports System.Data.Entity.Infrastructure.Interception
Imports System.Data.Entity

Public Class CustomDbLogFormatter
    Inherits DatabaseLogFormatter

    Public Sub New(context As DbContext, writeAction As Action(Of String))
        MyBase.New(context, writeAction)
    End Sub

    Public Overrides Sub LogCommand(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))
        'MyBase.LogCommand(Of TResult)(command, interceptionContext)
    End Sub

    Public Overrides Sub LogResult(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))

        Dim context = interceptionContext.DbContexts.OfType(Of NorthwindEfEntities).FirstOrDefault()
        If context IsNot Nothing Then
            Trace.WriteLine(context.CallingMethod + " Completed in " + Stopwatch.ElapsedMilliseconds.ToString)
        End If
   
    End Sub
End Class
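As mentioned earlier, the SQL string itself is easy to include if you need it. Here is a sketch of what LogResult might look like with the statement text added via the DbCommand’s CommandText property (the extra "ms" suffix is my addition):

```vbnet
    Public Overrides Sub LogResult(Of TResult)(command As Common.DbCommand, interceptionContext As DbCommandInterceptionContext(Of TResult))
        Dim context = interceptionContext.DbContexts.OfType(Of NorthwindEfEntities).FirstOrDefault()
        If context IsNot Nothing Then
            ' Log the caller and elapsed time, followed by the SQL statement itself.
            Trace.WriteLine(context.CallingMethod & " Completed in " &
                            Stopwatch.ElapsedMilliseconds.ToString & "ms: " &
                            command.CommandText)
        End If
    End Sub
```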

One step remains in order for this new logging implementation to take effect: we need to register the custom logger in our solution. To do this manually in code, we create a new class that derives from DbConfiguration. Then, in the constructor, we call the base class’s SetDatabaseLogFormatter to set the formatter to our new CustomDbLogFormatter. For more information on configuring the logging via your config file rather than in code, see AJ Vickers’ post on EF 6.1 turning on SQL logging without recompiling your app.

Public Class LogConfiguration
    Inherits DbConfiguration

    Public Sub New()
        SetDatabaseLogFormatter(Function(context, action) New CustomDbLogFormatter(context, action))
    End Sub
End Class

Naturally, the value that we are outputting is just an example; you’re free to make your logging as complex as you want given these basic building blocks. For more details, make sure to read the EF team’s posts on the logging implementation.

Posted on - Comment
Categories: Entity Framework - VB Dev Center -

Beware of Async Sub or Void

At VS Live last week, I gave a survey of asynchronous programming from .NET 1 through .NET 4.5. As part of the presentation, I showed off a simple Async example, starting with a synchronous version. Here’s the beginning example:

    Sub Main()
        DoWorkAsync()
        Debug.WriteLine("All Done")
        Console.ReadLine()
    End Sub

    Sub DoWorkAsync()
        PrintIt()
        Debug.WriteLine("Done Async")
    End Sub

    Public Sub PrintIt()
        Dim text = "Hello World"
        Task.Delay(2000)
        Debug.WriteLine(text)
    End Sub

Before continuing on, try to figure out what you would see in the output window. For those impatient, here’s the output:

Hello World
Done Async
All Done

However, if you try this code, you will notice that the output is all produced prior to the expected 2-second delay (from the Task.Delay). Why is the delay ignored? Because the TPL schedules the delay operation on a different thread and then lets the rest of the code continue on the main thread. We can force the delay to pause by changing the PrintIt method as follows:

    Public Sub PrintIt()
        Dim text = "Hello World"
        Task.Delay(2000).Wait()
        Debug.WriteLine(text)
    End Sub

Now we do delay for the 2 seconds as expected, but we lose any asynchronous behavior that we expected. As a result, we should use the new Async/Await keywords to make our PrintIt method async:

    Public Async Sub PrintIt()
        Dim text = "Hello World"
        Await Task.Delay(2000)
        Debug.WriteLine(text)
    End Sub

Now, if we check our output, we’ll see the following results:

Done Async
All Done
(2 second pause)
Hello World

Notice here that the Done Async and All Done messages appear before Hello World. Why? Because when the line with the Await is encountered, control is passed back to the calling method (DoWorkAsync), and the continuation of the await operation is scheduled on the thread context of the Task.Delay operation. This is just one of the problems with the “Async Sub” (in C#, “async void”) pattern. It is acceptable for fire-and-forget operations, but if you want to rely on structured exception handling, resource disposal, or a number of other useful constructs, you shouldn’t use Async Sub. For more information, see Lucian Wischik’s Async Patterns article or any of a number of excellent articles by Stephen Toub.
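To see why structured exception handling breaks down, consider a sketch like the following (the method names here are hypothetical, not part of the original demo). Because an Async Sub returns no Task, the caller has nothing to observe, and a Try/Catch around the call can’t catch an exception thrown after the Await:

```vbnet
    Sub CallIt()
        Try
            ThrowItAsync() ' Returns immediately; the exception surfaces later.
        Catch ex As Exception
            ' Never reached: the exception escapes to the synchronization
            ' context (or thread pool) instead of propagating to the caller.
            Debug.WriteLine("Caught: " & ex.Message)
        End Try
    End Sub

    Public Async Sub ThrowItAsync()
        Await Task.Delay(100)
        Throw New InvalidOperationException("Boom")
    End Sub
```

Had ThrowItAsync returned a Task and been awaited, the Catch block would have worked as expected.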

So, how do we call the PrintIt operation asynchronously and ensure that it completes prior to continuing with the “Done Async” operation? We need to change the signature of the PrintIt method to return a Task rather than nothing (void). We then need to move the Await up the stack to also Await PrintIt and mark DoWorkAsync as Async.

    Sub Main()
        DoWorkAsync()
        Debug.WriteLine("All Done")
        Console.ReadLine()
    End Sub

    Async Sub DoWorkAsync()
        Await PrintIt()
        Debug.WriteLine("Done Async")
    End Sub

    Public Async Function PrintIt() As Task
        Dim text = "Hello World"
        Await Task.Delay(2000)
        Debug.WriteLine(text)
    End Function

Now when we run our program, we see the following output:

All Done
Hello World
Done Async

Notice here that the “All Done” message appears before the 2 second delay, but Hello World and Done Async come out in the expected order showing how the DoWorkAsync operation was indeed run asynchronously from the rest of the app. If you want to see the internals of how the compiler interprets the Async and Await keywords, see my earlier post on using ILSpy with Async.

Also, if you are interested in using Async with Asp.Net or WCF, make sure to check out the Async session from Asp.Net Conf 2012 which details some of the potential issues you should consider there.

Posted on - Comment
Categories: VB Dev Center - VB -

JSON Literals for VB

One of the stand-out features of VB.Net since VB9 has been the inclusion of XML Literals. Followers of this blog should be well familiar with the concept because I first wrote about XML Literals way back in 2006. With them, we can embed XML directly into our code as follows:

Dim value = <root>
                <child attrib="foo">bar</child>
            </root>

While XML Literals make the task of working with XML more of a joy than a necessary evil, they do add a certain level of complexity to the language, and any language feature then needs to be maintained moving forward. When I asked Anders about adding them to C#, he pointed to that ongoing maintenance cost, along with the concern that, although XML was becoming a de-facto data persistence syntax, adopting the literals into the language would set a precedent: the language would then need to support other persistence mechanisms once XML was replaced with some other syntax.

With the rise of REST and the decline of SOAP, we have indeed seen the popularity of XML wane in favor of the more compact JSON syntax. Its popularity is helped by the fact that most JavaScript clients make parsing JSON into objects trivial. As a result, I have had conversations at conferences joking about the need for JSON literals in the language as well. At a recent conference, an idea came to me which could actually make them (almost) a reality.

At its heart, JSON is simply a string representation of JavaScript object structures. These object structures behave much like a .NET dynamic property bag. As a result, all we really need is a way to embed a long, multi-line string in our VB code and then parse it into a dynamic object in order to consume it. Unfortunately, VB doesn’t support multi-line string literals. However, it does support multi-line XML. All we need to do is wrap the multi-line string inside an XML element literal:

        Dim jsonLiteral = <js>[
            {author:'Jim Wooley', bookName:'LINQ in Action'},
            {author:'Frank Herbert', bookName:'Dune'},
            {author:'Joe Albahari', bookName:'LINQ Pocket Reference'},
            {author:'Joseph Rattz', bookName:'Pro LINQ'},
            {author:'Charlie Calvert', bookName:'Essential LINQ'}
        ]</js>

With that, we just need to decode the jsonLiteral.Value into a dynamic object. A quick search of Stack Overflow finds a number of handy options for this task. For the sake of this example, I’m just going to use the System.Web.Helpers.Json library that’s part of MVC. We’ll create an extension method to take an XElement and convert it into a dynamic object using the Json.Decode method:

<Extension>
Public Module JsonExtensions
    <Extension>
    Public Function JsonDecode(input As XElement) As Object
        Return Json.Decode(input.Value)
    End Function
End Module

With this in place, we can now operate on the literal just as if it was any other dynamic object type. Here’s the full code for this example.

Option Strict Off

Imports System.Runtime.CompilerServices
Imports System.Web.Helpers

Public Class Test1

    Public Sub TestJsonLiteral()
        Dim jsonLiteral = <js>[
            {author:'Jim Wooley', bookName:'LINQ in Action'},
            {author:'Frank Herbert', bookName:'Dune'},
            {author:'Joe Albahari', bookName:'LINQ Pocket Reference'},
            {author:'Joseph Rattz', bookName:'Pro LINQ'},
            {author:'Charlie Calvert', bookName:'Essential LINQ'}
        ]</js>

        For Each book In jsonLiteral.JsonDecode()
            Console.WriteLine(book.author)
        Next
    End Sub
End Class

<Extension>
Public Module JsonExtensions
    <Extension>
    Public Function JsonDecode(input As XElement) As Object
        Return Json.Decode(input.Value)
    End Function
End Module

I’m sure there are features that this technique doesn’t cover (including LINQ, because Object isn’t directly convertible to IEnumerable). I’m also not sure whether this has any practical benefit; it’s just interesting to consider.

Posted on - Comment
Categories: VB - VB Dev Center - Linq to XML -

Hierarchical Trees from Flat Tables using LINQ

I’m often tasked with creating a tree representation of a structure from a flat self-referencing table. For example, in the EF extensions to Northwind, they extended the Employee table so that it has a self-referencing “ReportsTo” column. As you can see from the data below, Andrew Fuller does not report to any other employees, but Nancy, Janet, Margaret, Steven, and Laura all report to Andrew (because their ReportsTo value is the same as Andrew’s EmployeeID). Likewise Michael, Robert, and Anne all report to Steven.

[Image: Employee table data showing the EmployeeID and ReportsTo columns]

In order to generate a tree representation of these records, we could start with the root records (those with no ReportsTo value) and then lazy load each of their children. Unfortunately, for large graphs the number of database hits grows rapidly as we add tree levels. Typically with parent-child relationships, we could eager load the children using DataLoadOptions with LINQ to SQL or .Include with Entity Framework. However, with self-referencing entities this isn’t allowed. With LINQ to SQL, you will get an InvalidOperationException: “Cycles not allowed in LoadOptions LoadWith type graph.”
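For reference, the eager-loading configuration that triggers this exception looks something like the following sketch (the context and association names are illustrative of the standard LINQ to SQL Northwind model):

```vbnet
        ' Hypothetical sketch: attempting to eager load a self-referencing association.
        Dim options = New System.Data.Linq.DataLoadOptions()
        ' Employees is the child collection generated from the ReportsTo relationship.
        ' LINQ to SQL detects the cycle and throws InvalidOperationException:
        ' "Cycles not allowed in LoadOptions LoadWith type graph."
        options.LoadWith(Of Employee)(Function(e) e.Employees)
        context.LoadOptions = options
```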

So, how do we load the tree in one pass and build the object graphs? It’s really not that hard once you realize how reference types (classes) work in .Net. Let’s start by creating a holder for each employee and their associated direct reports:

Public Class HierarchicalEmployee
    Public Sub New(emp As Employee)
           Model = emp
    End Sub
    Public Property Model As Employee
    Public Property DirectReports As New List(Of HierarchicalEmployee)
End Class

Now that we have this type, we can fill it using a simple LINQ request. In order to optimize the next step, we’ll push the values into an in-memory Dictionary indexed by the EmployeeID:

Dim allEmployees = Employees.
    Select(Function(emp) New HierarchicalEmployee(emp)).
    ToDictionary(Function(emp) emp.Model.EmployeeID)

Next, we iterate over the full list. For records that have a ReportsTo value, we’ll add their object pointer to their parent’s DirectReports list:

For Each emp In allEmployees.Values
  If emp.Model.ReportsTo.HasValue Then
    allEmployees(emp.Model.ReportsTo.Value).DirectReports.Add(emp)
  End If
Next

Notice here that we take advantage of the Dictionary’s hash lookup rather than having to iterate over the list each time we want to find a parent record. Finally, instead of returning the full list, we only return the employees that don’t have a parent (where ReportsTo is null):

Dim rootEmployees = allEmployees.Values.
    Where(Function(emp) Not emp.Model.ReportsTo.HasValue())

If we want to test this out in LINQPad, just use the Dump method on the resulting rootEmployees. As a result, you’ll see the following in the results pane. Notice Andrew is the only root object. He has 5 direct reports, and one of his reports has 3 reports of his own. You could just as easily bind this to a treeview control or output it using your favorite UI tooling.

[Image: LINQPad results pane showing the hierarchical employee tree]

The nice thing about this solution is that if we check the generated SQL, we will just see a simple (single) SQL request to generate the entire graph. As a summary, here’s the complete code from the LinqPad sample:

Sub Main
    Dim allEmployees = Employees.
        Select(Function(emp) New HierarchicalEmployee(emp)).
        ToDictionary(Function(emp) emp.Model.EmployeeID)

    For Each emp In allEmployees.Values
        If emp.Model.ReportsTo.HasValue Then
            allEmployees(emp.Model.ReportsTo.Value).DirectReports.Add(emp)
        End If
    Next
    
    Dim rootEmployees = allEmployees.Values.
        Where(Function(emp) Not emp.Model.ReportsTo.HasValue())
        
    
    rootEmployees.Dump
End Sub

Public Class HierarchicalEmployee
    Public Sub New(emp As Employee)
           Model = emp
    End Sub
    Public Property Model As Employee
    Public Property DirectReports As New List(Of HierarchicalEmployee)
End Class
Posted on - Comment
Categories: VB Dev Center - LINQ - Entity Framework -

Aggregate clause issues

I was reviewing a Stack Exchange message regarding the Aggregate clause in VB, where they found that the query was issuing multiple requests to the database and occasionally returning an entire database table to memory, then using LINQ to Objects over the result. I also found that Frans Bouma blogged about this back in 2008. Consider the following LINQ query over Northwind:

Dim query = Aggregate o In Orders
            Into Sum(o.Freight),
                 Average(o.Freight),
                 Max(o.Freight)

This produces the following T-SQL statements in EF. Notice here that the Sum and Avg are performed on the server, but the Max pulls the entire table into memory and performs the Max on the client. It would seem that this is an issue in the expression tree parser.

SELECT
[GroupBy1].[A1] AS [C1]
FROM ( SELECT
   SUM([Extent1].[Freight]) AS [A1]
   FROM [dbo].[Orders] AS [Extent1]
)  AS [GroupBy1]

GO

SELECT
[GroupBy1].[A1] AS [C1]
FROM ( SELECT
   AVG([Extent1].[Freight]) AS [A1]
   FROM [dbo].[Orders] AS [Extent1]
)  AS [GroupBy1]

GO

SELECT
[Extent1].[OrderID] AS [OrderID],
[Extent1].[CustomerID] AS [CustomerID],
[Extent1].[EmployeeID] AS [EmployeeID],
[Extent1].[OrderDate] AS [OrderDate],
[Extent1].[RequiredDate] AS [RequiredDate],
[Extent1].[ShippedDate] AS [ShippedDate],
[Extent1].[ShipVia] AS [ShipVia],
[Extent1].[Freight] AS [Freight],
[Extent1].[ShipName] AS [ShipName],
[Extent1].[ShipAddress] AS [ShipAddress],
[Extent1].[ShipCity] AS [ShipCity],
[Extent1].[ShipRegion] AS [ShipRegion],
[Extent1].[ShipPostalCode] AS [ShipPostalCode],
[Extent1].[ShipCountry] AS [ShipCountry]
FROM [dbo].[Orders] AS [Extent1]

For comparison, here’s the queries issued from LINQ to SQL:

SELECT SUM([t0].[Freight]) AS [value]
FROM [Orders] AS [t0]
GO

SELECT AVG([t0].[Freight]) AS [value]
FROM [Orders] AS [t0]
GO

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [Orders] AS [t0]

Interestingly, if you use From instead of Aggregate, the expression tree parsers seem to be able to handle this better. For example, the original query could be re-written as follows:

Dim query = From o In Orders
            Group By key = 0
            Into Sum(o.Freight),
                 Average(o.Freight),
                 Max(o.Freight)

This produces the following SQL (using LINQ to SQL):

SELECT SUM([t1].[Freight]) AS [Sum], AVG([t1].[Freight]) AS [Average], MAX([t1].[Freight]) AS [Max]
FROM (
    SELECT @p0 AS [value], [t0].[Freight]
    FROM [Orders] AS [t0]
    ) AS [t1]
GROUP BY [t1].[value]

For the time being at least, I have to agree with Frans that it is best to avoid using the Aggregate keyword in VB when accessing a database. I’ll update this if I hear any updates that are on the horizon.

Posted on - Comment
Categories: VB Dev Center - LINQ - Entity Framework -

Unit testing and SMTP

Over the past several years, I’ve become a fan of unit testing. I’m not a test-first/TDD zealot by any means, but I have found definite benefits in testing your custom business logic to assert that it does what you say it will. When going to a client who hasn’t done unit testing at all, I often find it hard enough to get them started and don’t want to overburden their tests with mocking frameworks to abstract out external dependencies. As a result, I have no qualms about incorporating both unit testing and integration testing into the mix here.

One of the things that can be tricky and time consuming to test is an operation that sends automatic emails via SMTP. If you send through a real server, the transmission can be a significant bottleneck from a performance standpoint, particularly if the server doesn’t actually exist. In the past, I’ve used local fake SMTP “servers” like the one at http://smtp4dev.codeplex.com/. With this, you simply run the server in the system tray, and any email sent to localhost port 25 will show up there. In your web.config, you can specify this quickly by adding the following node for your test project.

  <system.net>
    <mailSettings>
      <smtp deliveryMethod="Network">
        <network host="127.0.0.1" port="25" />
      </smtp>
    </mailSettings>
  </system.net>

An even better solution is to change the DeliveryMethod attribute to “SpecifiedPickupDirectory” and then specify a local path that you want new emails to appear in. From your unit test, you can then check that folder for the presence of the new email if you want to confirm that it was “sent” and open it to view the contents. The revised configuration is as follows:

  <system.net>
    <mailSettings>
      <smtp deliveryMethod="SpecifiedPickupDirectory">
        <specifiedPickupDirectory pickupDirectoryLocation="d:\Temp\smtp"/>
      </smtp>
    </mailSettings>
  </system.net>
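With that configuration in place, verifying delivery from a test reduces to checking the folder. Here’s a sketch of such a test (the test name and message values are mine, not from the original post), assuming MSTest and the pickup path configured above:

```vbnet
    <TestMethod>
    Public Sub SendMail_WritesToPickupDirectory()
        Dim pickupPath = "d:\Temp\smtp"
        ' Start from a clean slate so we only see this test's message.
        For Each f In IO.Directory.GetFiles(pickupPath, "*.eml")
            IO.File.Delete(f)
        Next

        ' Uses the <system.net>/<mailSettings> configuration above.
        Using client As New Net.Mail.SmtpClient()
            client.Send("from@example.com", "to@example.com", "Test Subject", "Test Body")
        End Using

        Dim sent = IO.Directory.GetFiles(pickupPath, "*.eml")
        Assert.AreEqual(1, sent.Length)
        Assert.IsTrue(IO.File.ReadAllText(sent(0)).Contains("Test Subject"))
    End Sub
```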

One additional item to point out: if you follow the MSDN help for SpecifiedPickupDirectory, you will currently get a validation warning because it shows deliveryMethod="specifiedPickupDirectory" with a lowercase “s”. Fortunately, the case of that attribute value is ignored, so it works either way.

Posted on - Comment
Categories: VB Dev Center -

Async Await StateMachine in ILSpy

In my Async Programming in .NET talk, I like to show what’s really happening under the covers with the new Async/Await keywords in VS 2012. To show this, I’m currently using ILSpy instead of Reflector because it is free. I noticed an “issue” tonight: I went looking for the MoveNext method of the compiler-generated state machine class but couldn’t find it. To begin, let’s look at the code sample.

Module AsyncWorld
    Sub DoWorkAsync()
        PrintIt()
        Console.ReadLine()
    End Sub

    Public Async Sub PrintIt()
        Dim text = "Hello World"
        Await Task.Factory.StartNew(
            Sub()
                Threading.Thread.Sleep(2000)
                Console.WriteLine(text)
            End Sub)
        Console.WriteLine("Done")
    End Sub

End Module

Ignoring the potential code smell of an Async Sub (async void in C#), let’s take a look at what ILSpy gives us out of the box for the generated classes:

[Image: ILSpy showing the generated closure class with its contained lambda]

Here we can see that the generated closure is displayed with the contained lambda, but the underlying state machine that manages it is missing. If you want to see the state machine, click Tools – Options and uncheck the “Decompile async methods (async/await)” option.

[Image: ILSpy Options dialog with the “Decompile async methods (async/await)” option unchecked]

Once you deselect that, refresh your project to see the compiler generated state machine:

[Image: ILSpy showing the compiler-generated state machine]

My, that’s a lot of code for our simple Console.WriteLine task. Can’t find our code? It’s called in this line:

taskAwaiter = Task.Factory.StartNew(New Action(Me.$VB$ResumableLocal_$VB$Closure_ClosureVariable_1$1._Lambda$__1)).GetAwaiter().

Remember the closure we saw above? Our code’s still in there. The state machine just lets us call it asynchronously, trapping exceptions as necessary. In most cases, you don’t need to worry about all of this compiler-generated code. You should be aware of it, though, if you are writing code with lots of Awaits in a single method, particularly in tight loops, because performance will suffer there. In those cases, it is better to work more natively with TPL Task objects and continuations than with Async/Await. For more information, I highly recommend that you check out Stephen Toub’s Zen of Async session from Build 2011.
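As a rough illustration of the continuation-based alternative, the earlier PrintIt example could be written against the Task API directly, with no state machine generated at all (this sketch is mine, not from the talk):

```vbnet
    ' Hypothetical sketch: chaining continuations explicitly rather than
    ' paying for a compiler-generated state machine on every Await.
    Public Sub PrintItWithContinuations()
        Dim text = "Hello World"
        Task.Delay(2000).
            ContinueWith(Sub(t) Console.WriteLine(text)).
            ContinueWith(Sub(t) Console.WriteLine("Done"))
    End Sub
```

Note that these continuations run on thread pool threads, so unlike Await you don’t get the calling context back for free.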

If you want to try this out more, download my async samples including comparative examples using Callbacks, APM, Tasks, Async/Await, and Reactive Extensions (Rx).

Posted on - Comment
Categories: VB Dev Center -

SignalR and Reactive Extensions are an Rx for server push notifications

Recently, I had the need to build a system where multiple clients needed to be notified as changes happened on the server. While watching Damian Edwards, Brad Wilson and Levi Broderick present on Async in ASP.Net during aspConf, I was introduced to the dynamic simplicity that SignalR brings to the table. I was even more intrigued by the fact that it integrates directly with IObservable and the Reactive Extensions. After using it for a week, I’m truly impressed by what they’ve done with this library. To give you an idea of what I mean, let’s take my ObservableSensor demo, which generates random values, and see how we can use SignalR to expose those values to a distributed set of clients.

Reactive Extensions on the Server

To begin, let’s look at the server. Here we will use the Observable.Generate method to generate some random values with associated random categories and the timestamp when the value was generated:

Option Strict Off

Imports Microsoft.VisualBasic
Imports System.Reactive.Linq
Imports SignalR

Public Class ObservableSensor

    Public Sub New()
        Dim rand = New Random(Now.Millisecond)

        Dim Generator = Observable.Generate(Of Double, SensorData)(
            initialState:=0,
            condition:=Function(val) True,
            iterate:=Function(val) rand.NextDouble,
            resultSelector:=Function(val) New SensorData With 
                                          {
                                              .Time = Now,
                                              .Value = val * 20,
                                              .Category = (CInt(val * 4)).ToString()
                                          },
            timeSelector:=Function(val) TimeSpan.FromSeconds(val))

        Generator.Subscribe(Sub(value)
                                Dim context = GlobalHost.ConnectionManager.GetHubContext(Of ObservableSensorHub)()
                                context.Clients.Broadcast(value)
                            End Sub)
    End Sub
End Class

Public Class SensorData
    Public Property Time As DateTime
    Public Property Category As String
    Public Property Value As Double
End Class

If you recall, we discussed the Observable.Generate implementation last year. The new part here occurs in the Subscribe implementation.

SignalR across the tiers

In this case, we are going to “Broadcast” our newly generated values to anyone listening. Notice that we are publishing our notifications without holding a direct reference to the SignalR hub; we grab hold of it using the GlobalHost ConnectionManager to get the hub in our AppDomain for the ObservableSensorHub type. What is this Hub thing, you may ask? Well, here is the implementation for the ObservableSensorHub:

Imports SignalR.Hubs

Public Class ObservableSensorHub
    Inherits Hub

End Class

In case you’re wondering, no, I’m not missing code here. That’s the complete implementation. We’re just creating a strongly typed instance of the SignalR.Hubs.Hub type for the ConnectionManager to work with. In this simple application, we’re just going to start generating values when the web application starts. In the Global.asax, add the following implementation:

    Private Sensor As ObservableSensor

    Sub Application_Start(ByVal sender As Object, ByVal e As EventArgs)
        Sensor = New ObservableSensor()
    End Sub


At this point, we have a server that can broadcast our random values over HTTP to any clients that wish to subscribe. Before moving to the client, I need to say a bit more about the Broadcast “method” that we are calling on the Hub’s Clients. If you look at the type definition of Clients, you will see that there is no Broadcast method; Clients is actually a dynamic object. At run time, we are declaring that it has a method called Broadcast. The SignalR infrastructure then knows how to translate an invocation of a method by that name into an HTTP message to be sent to any clients (serializing the results into a JSON object using Json.Net). Remember, in Visual Basic we enable the dynamic functionality by specifying Option Strict Off at the top of our class definition.

Now, how do we consume these messages? Let’s start with a console application. First, make sure that you’ve installed and added references to the SignalR.Client library; the easiest way is to use NuGet. Rather than a bunch of text, let’s just jump to the code:

Option Strict Off

Imports SignalR.Client.Hubs
Imports System.Reactive.Linq
Imports Newtonsoft.Json.Linq

Module Module1

    Sub Main()
        Dim cn = New HubConnection("http://localhost:5687/")
        Dim sensor = cn.CreateProxy("observableSensorHub")

        sensor.On(Of SensorData)("broadcast", Sub(item) Console.WriteLine(item.Value))

        cn.Start().Wait()
        Console.ReadLine()

    End Sub

End Module


Here, we create a new HubConnection specifying the endpoint of our web application. SignalR does support self-hosted servers if you want to use a Windows Service or other back-end implementation; as long as the client can see the server over the network, you can wire it up. Second, we create a dynamic proxy specifying the hub type that we created on the server. Note that the casing of the proxy name is important: it uses camel casing even though the implementation on the server used Pascal casing. This is done to make the hubs and methods feel more natural to JavaScript clients.

Next, we specify how to handle the push notifications that the server sends. We do that using the .On method, specifying the type (SensorData) that the message should be deserialized into. We then specify the name of the method (altering the case as mentioned above) that we are listening for, along with the Action that should be invoked as each value is received. In this case, we just output the value that we received to the console window.

Rx in the Client

At this point we have Rx pushing messages to the client via SignalR. Let’s take it a step further and add some Rx goodness on the client side as well. In addition to the On method, the proxy also supports an Observe method which turns the message pump into an IObservable of an array of Objects, where the method’s parameters are contained in that array. Since our Broadcast method has a single parameter of type SensorData, we grab it by getting the first array element and calling Json.NET’s .ToObject method to translate it back into our strongly typed object. From there, we work with it just as we would any other Observable sequence. For example, to output only the generated values for Category 1, we could use the following:

        Dim cn = New HubConnection("http://localhost:5687/")
        Dim sensor = cn.CreateProxy("observableSensorHub")

        Dim items = From item In sensor.Observe("broadcast")
                    Let instance = item(0).ToObject(Of SensorData)()
                    Where instance.Category = "1"
                    Select instance

        Using items.Subscribe(Sub(value) Console.WriteLine(value.Value))

            cn.Start().Wait()
            Console.ReadLine()

        End Using

In this case, we Start the connection inside of our subscription’s Using block along with the Console.ReadLine. Once a key is pressed, the subscription is disposed, freeing our resources.

One of the nice things about SignalR is its flexibility. Pretty much anything that speaks HTTP can consume these messages from our Rx server. If we wanted to consume the same messages in a web client, we could use the following JavaScript code:

      $(function () {
          // Proxy created on the fly
          var hub = $.connection.observableSensorHub;

          // Declare a function on the chat hub so the server can invoke it
          hub.Broadcast = function (value) {
              $('#values').append('<li>' + value.Time + ': ' + value.Value + '</li>');
          };

          // Start the connection
          $.connection.hub.start();
      });

In the JavaScript, we connect to our server by referring to $.connection.observableSensorHub. Notice the camel-cased name here? That’s the SignalR translation in action again. We then specify the handler for the dynamically invoked “Broadcast” method. Here we just add list item entries to the unordered list.

As I said above, I’m impressed with SignalR so far, so don’t be surprised to see me post more about it in the future. For now, feel free to download this sample and try it out yourself. You will need to refresh the NuGet packages to get it to run. Also, realize that the sample was built with RC builds of Visual Studio 2012, Rx 2.0, and SignalR, so I can’t guarantee that it will continue to work once these are officially released. If you notice any issues or want to see more details, let me know what you Thinq below.

Posted on - Comment
Categories: VB Dev Center - Rx -

Windows 8 Live Tiles and LINQ to XML

When I started working with Windows 8 Metro development, I was disappointed to see that the native WinRT APIs relied on the older XmlDocument and XML DOM rather than the newer XDocument/XElement and LINQ to XML. I suppose this was necessary because the older version was more natural for the C++ and JavaScript options for Metro applications. If you’re like me and want to use LINQ to XML to work with XML in WinRT, all you need to do is pass the XML strings back and forth between your C#/VB code and the native WinRT methods.

Let’s consider the case of custom Tile Notifications. In order to set one, you first get the XML template from the TileUpdateManager. This returns an XmlDocument object with an XML template looking like this:

<tile>
  <visual>
    <binding template="TileWidePeekImage01">
      <text id="1"></text>
    </binding>
  </visual>
</tile>

In order to set the text, we need to locate the “text” element with an id attribute of 1 and set its value. While we could create the XML from scratch, it is safer to just set the node’s value and retain the rest of the XML. If you’ve been following my blog for any time, you know I find the LINQ to XML API much easier to use than the older XML DOM. Luckily, moving from XmlDocument to XDocument is as simple as calling XDocument.Parse on the result of the XmlDocument’s GetXml method. To push our changed XDocument back into the XmlDocument, we call LoadXml, passing the string representation of the XDocument via ToString. Here’s the code to grab the TileSquareText04 template and set the text value to “Text Line 1”.
Dim tileXml = TileUpdateManager.GetTemplateContent(TileTemplateType.TileSquareText04)

' Transition to XDocument
Dim xTile = XDocument.Parse(tileXml.GetXml())
' Manipulate with LINQ to XML
xTile...<text>.FirstOrDefault(Function(node) node.@id = "1").SetValue("Text Line 1")

' Set XmlDocument
tileXml.LoadXml(xTile.ToString())
Dim notification = New TileNotification(tileXml)
TileUpdateManager.CreateTileUpdaterForApplication().Update(notification)

That’s all it takes. If you don’t like this template, there’s plenty more where this came from. MSDN has a great page showing each of the tile samples along with the template XML you need to use.

Posted on - Comment
Categories: Linq to XML - WinRT - VB Dev Center -

LINQ to Database Performance hints

Although I’ve been writing primarily about Rx here for a while, I still dig into good old LINQ to SQL / EF quite a bit. Along the way, I’m frequently finding more tidbits to throw in the toolbox, and thought I’d share some observations from a recent performance tuning project. (I’ll avoid disclosing the client, and will change the database model to Northwind for this discussion to protect the guilty.)

In this project, I was brought in at the end, as things started going south, to try to get it across the finish line. As with many projects, they waited until after development was complete before they started to consider the performance impact of their code. This violates one of the principal tenets of performance testing:

RULE 1: Test early and often

Performance for larger scale apps needs to be considered from the beginning of the project and measured. If you don’t have an ongoing way of measuring the performance as the application progresses through the development lifecycle, you will find it difficult to pinpoint specific performance bottlenecks over time.

As an example, several years ago I met with core members of the Microsoft CLR team. They shared the fact that they compared performance on each night’s build, and if they found deviations of more than a couple of microseconds, the teams were required to identify the source of the degradation. When working with systems that have millions of lines of code, if they waited a week it might become nearly impossible to identify the multiple places where performance had dropped by 1/100th of a second. Over time these performance penalties build up.

I don’t mean to suggest that your mom-and-pop web site needs to live up to this stringent a testing standard, nor am I insinuating that you should prematurely over-optimize your code. But try to make intelligent decisions, and have ways of determining up front when your decisions negatively impact your performance.

Why do I bring this up in the context of LINQ to DB (SQL/EF)? If you’re not testing and monitoring your app, you may find that a nice abstraction layer you added just killed your performance. In the case of the client’s application, they had nicely abstracted away the ability to load various database structures into a generic repository. They had separate methods for GetCustomerById, GetOrdersForCustomerId, GetOrderDetailsForOrderId, etc. They also had helper methods for validation, including ValidateIsCustomerNameUnique. The downside is that, reading through the code, it wasn’t easy to notice the hidden database calls sprinkled throughout. This brings up rule number 2 for performance testing with databases:

RULE 2: Profile your application

Profiling your application for external requests, including services is essential to make sure the requests you are making are not causing a performance bottleneck. I highly recommend using some sort of profiling tool, particularly for people who are new to LINQ. LINQ makes it easy to make silly mistakes like n+1 requests to a database when navigating lazy-loaded parent-child relationships.

There are plenty of options for profiling applications. I identified some of them back in my LINQ Tools post. Some of these require adding code to your assemblies, while others simply attach to running processes.

If you have a license for SQL Server, you probably have access to SQL Server Profiler. To use this tool, create a new trace pointing at your database and run your application. Once you have executed the code you want to profile, you can view all of the requests made against that database. Pay particular attention to cases where the same SQL is issued multiple times in succession. The downside here is that you will need to manually trace through your code base to find where each request is made in order to fix the issues.

Another alternative, if you have the Ultimate SKU of Visual Studio, is the IntelliTrace feature added in Visual Studio 2010. Using IntelliTrace, you can not only identify which SQL requests were issued to the database, but also select each line in the results and navigate to the line of code that caused the request to be issued. Let’s consider this fairly innocuous code snippet to output the order date and product names associated with each order:

   1:  Using model As New NwindEntities
   2:      For Each o In model.Orders.Take(3)
   3:          Console.WriteLine(o.OrderDate)
   4:          For Each od In o.Order_Details
   5:              Console.WriteLine(od.Product.ProductName)
   6:          Next
   7:      Next
   8:  End Using

Running this and putting a breakpoint on line 8, we can see the following IntelliTrace output:

[screenshot: IntelliTrace output]

From here, we can see that as we iterate over each order, we are making a separate request to the database for each associated order detail record to get the related product. Clicking any line of the output takes you directly to the line of code from which the request was made. Unfortunately, you won’t be able to view the parameters for each request with IntelliTrace as you can with SQL Profiler, but from a performance tuning perspective that’s less important than knowing you have excessive requests to the database. If you find that the IntelliTrace output becomes cluttered with non-database trace items, it often helps to filter the results to only ADO.NET events:

[screenshot: IntelliTrace events filtered to ADO.NET]

Fortunately, fixing the code above only requires changing line 2 in the code above to eagerly load the child records:

For Each o In model.Orders.Include("Order_Details").Include("Order_Details.Product").Take(3)

We may want to further optimize this request by not hydrating the entire objects, but rather fetching just the values that we want:

Using model As New NwindEntities
    Dim query = From o In model.Orders
                Select o.OrderDate,
                    ProductNames = From od In o.Order_Details
                                   Select od.Product.ProductName

    For Each o In query.Take(3)
        Console.WriteLine(o.OrderDate)
        For Each p In o.ProductNames
           Console.WriteLine(p)
        Next
    Next
End Using

If you don’t have access to SQL Profiler or IntelliTrace, consider one of the other relatively inexpensive profiling tools out there, like the MVC MiniProfiler, ORM Profiler, Huagati’s LINQ to SQL Profiler, or EF Prof, or at a bare minimum check the generated SQL for your queries using LINQPad.

With your application now profiled, hopefully you won’t run into issues such as those found in the following code (which I did see in the customer’s app, again changing the model to protect the guilty):

Debug.WriteLine(Customers.FirstOrDefault().CompanyName)
Debug.WriteLine(Customers.FirstOrDefault().ContactName)
Debug.WriteLine(Customers.FirstOrDefault().ContactTitle)
Debug.WriteLine(Customers.FirstOrDefault().Phone)
Debug.WriteLine(Customers.FirstOrDefault().Fax)
Dim CustomerOrders = Customers.FirstOrDefault().Orders
For Each order in CustomerOrders
    ' Load the order into a POCO
Next

Quick quiz, boys and girls: without using a profiler, how many requests are we making to our database here? Remember that not only will you get a database hit for each call to FirstOrDefault, which is not deferred, but you’ll also get one from GetEnumerator (called internally by the For Each iteration). Thus the count is 7 for the above code, right? Actually, it’s worse, because hidden behind Debug.WriteLine is a trace writer which also writes each value to the database’s log table. As a result we actually have 12 database requests (7 reads and 5 writes) instead of the single query that we should have used. In this case we’re breaking rule #3:

RULE 3: Don’t fetch needlessly

In the above code, we can simply fetch our target customer once and automatically include their associated Orders as follows:

Dim firstCustomer = Customers.Include("Orders").FirstOrDefault()
Debug.WriteLine(firstCustomer.CompanyName)
Debug.WriteLine(firstCustomer.ContactName)
Debug.WriteLine(firstCustomer.ContactTitle)
Debug.WriteLine(firstCustomer.Phone)
Debug.WriteLine(firstCustomer.Fax)
Dim CustomerOrders = firstCustomer.Orders
For Each order in CustomerOrders
    ' Load the order into a POCO
Next

In this case we fetch firstCustomer once and reuse it rather than calling FirstOrDefault repeatedly. We also use the .Include option to eagerly fetch the child records. I’m saddened that the original developer didn’t use a bit of brain power to eliminate those extra database hits FOR LOGGING PURPOSES, but I assume it was because they weren’t aware of when their database was being hit. Of course, this brings us back to rule 2 – profiling.

Simply removing excessive database hits will almost always improve your code’s performance. In one case, I had a request that was taking 10 minutes. After removing the excessive hits, it came down to 10 seconds, which was a definite improvement. However, 10 seconds still does not make the application web scale, as the database’s CPU is pegged for that amount of time. Sometimes it is actually best to break the process up a bit to improve performance.

RULE 4: Break up complex queries

To give you an idea of the kind of query we’re talking about, consider the following case where we fetch the employees in the USA along with their regions and sales information.

Dim empOrders = 
    From e In Employees
    Where e.Country = "USA"
    Select 
        e.FirstName,
        e.LastName,
        Regions = From t In e.Territories
                  Select t.Region.RegionDescription
                  Distinct,
        TopThreeSales = From o In e.Orders
                        From od In o.OrderDetails
                        Select
                            od.Product.ProductName,
                            TotalSale = od.Quantity * od.UnitPrice
                        Order By TotalSale Descending

While this code will compile and run, the performance will start to suffer as larger volumes of data are added to the database. The reason is that both Territories and Orders are child collections of Employees. As a result, SQL returns the Cartesian product of the two sets. In other words, if a single employee is associated with 5 territories and has sold 10 products, the total number of rows returned would be 50. The OR/M is then responsible for splitting those results back out into the correct groupings. Multiply this across 10,000 employees and you will find that a massive amount of excess data is returned.

In this case, it may be helpful to split your subqueries into separate database requests. You could do something like the following, using the Contains clause to pass the IDs of the parent records into the subsequent queries.

Dim empOnly = 
    (From e In Employees 
    Where e.Country = "USA" 
    Select 
        e.EmployeeID, 
        e.FirstName, 
        e.LastName). 
    ToList()

Dim EmpIds = empOnly.Select(Function(e) e.EmployeeID)

Dim TopThreeSales = 
    From o In Orders
    Where EmpIds.Contains(o.EmployeeID)
    From od In o.OrderDetails
    Select 
        o.EmployeeID,
        od.Product.ProductName,
        TotalSale = od.Quantity * od.UnitPrice
    Order By TotalSale Descending

However, “Contains” has a couple of hidden surprises that limit its use here. First, if you use Contains, you cannot turn this into a compiled query because the number of parameters varies at run time. Second, if you have more than 2100 items in your EmpIds collection, you will run into a hard limit in SQL Server, which only allows up to 2100 parameters. In that event, it is better to re-apply the original filter in each of the subsequent queries. In the end, we can join the separate result sets back together using LINQ to Objects:

Dim empOnly = 
    (From e In Employees
    Where e.Country = "USA"
    Select 
        e.EmployeeID,
        e.FirstName,
        e.LastName).
    ToList()

Dim Regions = 
    (From e In Employees 
    Where e.Country = "USA"
    From t In e.Territories
    Select 
        e.EmployeeID,
        t.Region.RegionDescription
    Distinct).
    ToList()

Dim TopThreeSales = 
    (From e In Employees 
    Where e.Country = "USA"
    From o In e.Orders
    From od In o.OrderDetails
    Select 
        o.EmployeeID,
        od.Product.ProductName,
        TotalSale = od.Quantity * od.UnitPrice
    Order By TotalSale Descending).
    ToList()
                
Dim combined = 
    From e In empOnly
    Select 
        e.FirstName,
        e.LastName,
        EmpRegions = Regions.Where(Function(reg) e.EmployeeID = reg.EmployeeID),
        Sales =  TopThreeSales.Where(Function(sale) e.EmployeeID = sale.EmployeeID)
    

In the above referenced project, I was able to take some queries that as a single LINQ statement took 10 seconds to run on the database down to sub-second requests. Your mileage may vary using this technique. If at all unsure, refer back to Rule 2: Profile.

RULE 5: Use the ORM by default, but stored procedures where necessary

At times you will find that the generated query is too complex and the database has issues trying to process the generated SQL. The complexity of your query, particularly the number of joins and the depth of the object graph/inheritance model you are traversing, can cause issues. In these cases, I have no objection to using stored procedures to wrangle the otherwise unruly queries. LINQ to DB is great for the 70-80% of CRUD operations, but there are times, particularly in reporting, when you need something else. Thankfully, both LINQ to SQL and EF support consuming stored procedures when the need arises, without you having to write tedious and potentially error-prone custom ADO.NET code yourself.
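As a sketch of what consuming one looks like (assuming a hypothetical GetTopCustomerSales stored procedure that has been imported as a function on the context through the designer), the call reads like any other method on the context:

```vb
' Hypothetical example: GetTopCustomerSales is assumed to be a stored
' procedure imported as a function on the generated NwindEntities context.
Using model As New NwindEntities
    ' EF handles the parameter plumbing and materializes the result rows,
    ' so no hand-written ADO.NET code is required.
    For Each row In model.GetTopCustomerSales("ALFKI")
        Console.WriteLine(row.ProductName)
    Next
End Using
```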

In addition, LINQ is not a replacement for ETL operations. In one case, saving a single record with 2000 children caused the database to churn for 10 minutes due to 2001 individual hits during the update process (from a single SaveChanges call). We rewrote that operation using SqlBulkCopy and brought it down from 10 minutes to 1 second.
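A minimal sketch of that kind of rewrite, assuming the child records have been loaded into a DataTable whose columns match a hypothetical OrderDetails table, might look like this:

```vb
Imports System.Data
Imports System.Data.SqlClient

Module BulkLoader
    ' Assumes childRows matches the schema of the target OrderDetails table.
    Public Sub BulkInsertDetails(connectionString As String, childRows As DataTable)
        Using cn As New SqlConnection(connectionString)
            cn.Open()
            Using bulk As New SqlBulkCopy(cn)
                ' A single bulk insert replaces the thousands of individual
                ' INSERT statements issued one at a time by SaveChanges.
                bulk.DestinationTableName = "OrderDetails"
                bulk.WriteToServer(childRows)
            End Using
        End Using
    End Sub
End Module
```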

RULE 6: Use Appropriate Sample Data

The last item for today: make sure, when developing, that your sample data approximates the quantities you will have after a couple of years of production data enter your system. We found a number of cases where complex queries like the one I described in rule 4 above performed fine in development, where we had thousands of rows, but died when applied against production data with millions of rows. The complex joins that worked fine against the smaller data sets no longer worked against the bigger ones. If we had had a good approximation of production data volumes in development, we would have been able to diagnose and fix this issue before shipping the version to production.

That’s just a couple of my rules of thumb which have helped me diagnose and fix performance issues with LINQ to SQL/EF. If you have any tricks or techniques to add, please let me know what you Thinq.

Posted on - Comment
Categories: LINQ - VB Dev Center - Entity Framework -

Using Rx to consume a Task based WCF service

Among the many changes that Dev 11 brings is a new default when adding a service reference: the generated proxy methods are now Task-based rather than the APM flavor (the BeginXXX/EndXXX model). In this post, we’ll look at creating a simple service and then consuming it using the Reactive Extensions. Let’s start by defining the service interface and implementation:

Imports System.ServiceModel
Imports System.Threading 

<ServiceContract()>
Public Interface ISimpleServicesvc

    <OperationContract()>
    Function DoSomethingCool(input As String) As String

End Interface

Public Class SimpleServicesvc
    Implements ISimpleServicesvc

    Public Function DoSomethingCool(input As String) As String Implements ISimpleServicesvc.DoSomethingCool
        Return (String.Join("", From letter In input.ToCharArray()
               Order By letter
               Distinct))
    End Function

End Class

Essentially, we are just taking a string input and returning the distinct characters, sorted. The details of the service in this case are trivial; our focus here is how to implement the service client. We start by adding a service reference in our client application by right-clicking on the project and selecting Add Service Reference. (Alternatively, you can now press Ctrl-Q and request “Add Service Reference” from there.) From the dialog, you can still use the “Discover” button to locate the service as long as it is in your solution.

[screenshot: Add Service Reference dialog]

One thing to note is that the proxy classes are now by default generated using Task-based methods rather than the previous IAsyncResult APM methods.

[screenshot: service reference configuration showing Task-based operations]

As a result, the definition of the proxy class is as follows:

Public Function DoSomethingCoolAsync(ByVal input As String) 
                 As System.Threading.Tasks.Task(Of String) 
                 Implements SimpleService.ISimpleServicesvc.DoSomethingCoolAsync
    Return MyBase.Channel.DoSomethingCoolAsync(input)
End Function

If we wanted to consume this using the new Async/Await, we could do it as follows:

Private Async Sub SubmitClicked() Handles SubmitButton.Click
   Dim svc = New SimpleService.SimpleServicesvcClient()
   Dim req = Await svc.DoSomethingCoolAsync(InputText.Text)
   OutputText.Text = req
End Sub

Of course, to put the LINQ spin on this, let’s see the Rx version to do the same thing:

Private Sub SubmitClicked() Handles SubmitButton.Click
    ' No Await here, so the Async modifier is no longer needed
    Dim svc = New SimpleService.SimpleServicesvcClient()
    Dim req = svc.DoSomethingCoolAsync(InputText.Text).ToObservable()
    req.ObserveOnDispatcher().Subscribe(Sub(val) OutputText.Text = val)
End Sub

We start by turning the Task into an observable producer using the ToObservable extension method. We then subscribe to the observable, making sure to return to the dispatcher thread because the Task-based service runs on a thread-pool thread. Of course, in this case we are subscribing on every button click. With Rx, we could instead wire the button click and service request up when navigating to the form and unwire them when navigating away, as follows:

Private requestDisposable As IDisposable
Protected Overrides Sub OnNavigatedTo(e As Navigation.NavigationEventArgs)
    Dim svc = New SimpleService.SimpleServicesvcClient

    requestDisposable = (From click In Observable.FromEventPattern(Of RoutedEventArgs)(SubmitButton, "Click")
                        From req In svc.DoSomethingCoolAsync(InputText.Text).ToObservable()
                        Select req).
                        ObserveOnDispatcher().
                        Subscribe(Sub(val) OutputText.Text = val)
End Sub
Protected Overrides Sub OnNavigatedFrom(e As Navigation.NavigationEventArgs)
    MyBase.OnNavigatedFrom(e)
    requestDisposable.Dispose()
    requestDisposable = Nothing
End Sub

Posted on - Comment
Categories: WCF - Rx - VB Dev Center -

WinRT Contacts

In preparation for recording a session of Deep Fried Bytes on the WinRT namespaces, I scanned through the public ones in the Developer Preview to see what I could find. One of the interesting ones that I hadn’t heard about yet is the Contact class in the Windows.ApplicationModel.Contacts namespace. This appears to be preparing for an integrated contact experience similar to the People hub found on the Windows Phone.

As with files, music, pictures, etc., you access contacts through a picker, in this case the ContactPicker object. It has two main methods: PickSingleContactAsync and PickMultipleContactsAsync. Unfortunately, it appears that there is a bug in the PickMultipleContactsAsync implementation making it unusable from type-safe languages at the moment. The WinRT samples do include a working version in JavaScript if you want to go that route.

The ContactInformation object supports the following properties: Name, Locations (addresses), Emails, PhoneNumbers, InstantMessages, and CustomFields. Custom Fields support Name, Value, Category and Type. With this, you can handle a wide variety of information about your friends.

I was able to create some sample code to get a contact:

Dim contPicker As New Windows.ApplicationModel.Contacts.ContactPicker
Dim contact = Await contPicker.PickSingleContactAsync()
Dim contactName = contact.Name

Unfortunately, when running this code on the current bits, the following message appears.

[screenshot: error message]

I guess we’ll have to wait until the store opens and contact apps start appearing to use it. Or will we? If you open the Package.AppManifest and navigate to the Declarations, you can add the Contact Picker declaration to your package, along with your own picker page and search algorithm against your proprietary store, similar to how you can enable your application to surface files from Facebook, SkyDrive, Flickr, etc.

[screenshot: Contact Picker declaration in the package manifest]

Posted on - Comment
Categories: VB Dev Center -

Windows 8 Hands On Labs available

At Build, we had the opportunity to preview a number of Hands On Labs for developing and managing Metro styled applications. Unfortunately, they were not available to non-Build attendees, and we weren’t able to bring them with us from the conference.

Thankfully, the Hands On Labs are now available for everyone. You can download them now from the Build website. I tried a couple of them at the conference and they appeared to be fairly well done. Labs are available in C#, VB, C++ and JavaScript. If you’re looking to learn the preferred way of doing standard tasks, I recommend trying them out. The standard caveats apply: these are initial labs for a pre-beta operating system, and both the labs and the platform are likely to change as Windows continues to evolve. That’s the price of living on the bleeding edge: occasionally you will get cut.

Update: In addition to the Build Hands On Labs, Microsoft has also released the Visual Studio 11 Developer Preview Training Kit to help you understand the new features of Dev 11 including more hands on labs.

Posted on - Comment
Categories: VB Dev Center -

Cancelling a Reactive Extensions Observable

I’m often asked how to cancel an Observable. In previous posts, I’ve shown how to stop observing by disposing the result of the subscription to the Observable. Of course, this isn’t the only way. If you are using one of the Observable.Generate methods, you can hold a reference to an object (like a System.Threading.CancellationTokenSource), set a flag on that object, and then, in the condition lambda, check whether the cancellation flag has been set. Here’s an example:


    Private ts As New System.Threading.CancellationTokenSource

    Protected Overrides Sub OnInitialized(e As System.EventArgs)
        MyBase.OnInitialized(e)

        Dim items As New ObservableCollection(Of Integer)
        ItemsListbox.ItemsSource = items

        Dim obs = Observable.Generate(0,
                                      Function(x) Not ts.IsCancellationRequested,
                                      Function(index) index + 1,
                                      Function(index) index,
                                      Function(index) TimeSpan.FromSeconds(1))

        obs.ObserveOnDispatcher().
            Subscribe(Sub(item)
                          items.Add(item)
                      End Sub)
    End Sub

    Private Sub Cancel_Click(sender As System.Object, e As System.Windows.RoutedEventArgs) Handles Cancel.Click
        ts.Cancel()
    End Sub

In this example, we’re setting up a class-level variable (ts) as a CancellationTokenSource. When we create our observable using Generate, the second parameter evaluates whether or not it should continue iterating. By checking ts.IsCancellationRequested there, we evaluate the flag on each iteration. Because ts is a class-level variable, we can cancel the sequence by calling Cancel() in the Cancel button’s click event handler.

As another alternative, we can convert the Cancel click into an observable sequence as well by using Observable.FromEventPattern, then combine it with the main observable using TakeUntil as follows:


    Protected Overrides Sub OnInitialized(e As System.EventArgs)
        MyBase.OnInitialized(e)

        Dim items As New ObservableCollection(Of Integer)
        ItemsListbox.ItemsSource = items

        Dim cancelClicked = Observable.FromEventPattern(Of RoutedEventArgs)(Cancel, "Click")

        Dim obs = Observable.Generate(0,
                                      Function(x) True,
                                      Function(index) index + 1,
                                      Function(index) index,
                                      Function(index) TimeSpan.FromSeconds(1))

        obs.TakeUntil(cancelClicked).
            ObserveOnDispatcher().
            Subscribe(Sub(item)
                          items.Add(item)
                      End Sub)
    End Sub

If you have another favorite alternative for cancelling an Observable, let me know.

Posted on - Comment
Categories: Rx - VB Dev Center -

Revising the Reactive Sensor Generator

When I first created the Reactive sensor sample, I wasn’t completely happy with it: if any of the various subscribers was disposed or triggered an OnCompleted (like the Any clause), it would push a completed state to all of the other “listeners”. This is not what I intended. To make it easy, let’s review how we created the sensor originally:


    Private _observers As New List(Of IObserver(Of SensorInfo))
    Private _running As Boolean

    Public Function Subscribe(ByVal observer As System.IObserver(Of SensorInfo)) As System.IDisposable _
                              Implements System.IObservable(Of SensorInfo).Subscribe
        _observers.Add(observer)
        Return Me
    End Function

    Public Sub StartSensor()
        If Not _running Then
            _running = True
            Dim randomizer = New Random(Date.Now.Millisecond)
            While _running
                Dim randVal = randomizer.NextDouble
                If _observers.Any Then
                    Dim info As New SensorInfo With {.SensorType = CInt(randVal * 4).ToString,
                                                     .SensorValue = randVal * 20,
                                                     .TimeStamp = Now}

                    _observers.ForEach(Sub(o) o.OnNext(info))
                End If
                Threading.Thread.Sleep(CInt(randomizer.NextDouble * 500))
            End While
        End If
    End Sub

For a quick review, we set up an internal list of observers and manually added new subscribers to this list. We didn’t have a good way of removing them from the list, however, through the typical Dispose implementation. Then, as we loop in the While loop, we generate a new sensor reading and announce it to each of the listeners through _observers.ForEach(Sub(o) o.OnNext(info)). Because this was constantly running (originally on the calling thread), we had to take the extra effort to run it on a background worker.

With more reflection and further work with Reactive, we can simplify the process quite a bit here by using the Observable.Generate (or in earlier builds Observable.GenerateWithTime) to create an observable that we expose to our subscribers. The subscriber is then responsible for subscribing and disposing itself from our centralized observable. Our revised implementation to generate random sensor readings at random time intervals is as follows:


Public Class Sensor

    ' Share one Random instance; creating a new Random for every reading with a
    ' millisecond-based seed can produce repeated values within the same millisecond.
    Private _randomizer As New Random(Date.Now.Millisecond)

    Private _sensorObservable As IObservable(Of SensorInfo)

    Public Sub New()
        _sensorObservable =
            Observable.Generate(Of Int16, SensorInfo)(initialState:=0,
                condition:=Function(x) True,
                iterate:=Function(inVal) inVal,
                resultSelector:=Function(x)
                                    Dim randValue = _randomizer.NextDouble
                                    Return New SensorInfo With {.SensorType = CInt(randValue * 4).ToString,
                                                                .SensorValue = randValue * 20,
                                                                .TimeStamp = Now}
                                End Function,
                timeSelector:=Function(x) TimeSpan.FromMilliseconds(_randomizer.NextDouble * 100)
            )
    End Sub

    Public ReadOnly Property SensorObservable As IObservable(Of SensorInfo)
        Get
            Return _sensorObservable
        End Get
    End Property

End Class

In the new implementation we use the Observable.Generate that we discussed when mocking the phone accelerometer. We still need to be aware of our threads because the timeSelector runs on a background thread automatically. As a result, our values are being generated on a background thread.

When we subscribe to this, we first create a shared instance of the Sensor class and then subscribe our Rx queries to the sensor’s SensorObservable property:


Dim AnySensor = sensor.SensorObservable.Any(
                  Function(s) s.SensorValue > 17)
AnySensor.Subscribe(Sub(s) MessageBox.Show(s.ToString, "OutOfRange"))

If you want to remove a subscription, keep a handle on the disposable that was returned when you subscribed and dispose it to stop listening.


Private ActiveLowValueSensors As IDisposable

Private Sub FilterLowValue_Click() Handles FilterLowValue.Click
    If ActiveLowValueSensors Is Nothing Then
        Dim lowValueSensors = From s In sensor.SensorObservable
                              Where s.SensorValue < 3
                              Select s.SensorValue

        ActiveLowValueSensors = lowValueSensors.Subscribe(
           Sub(val) Console.WriteLine(val))
    Else
        ActiveLowValueSensors.Dispose()
        ActiveLowValueSensors = Nothing
    End If

End Sub

I've updated the WPF samples with this change so you can take it out for a spin if you would like. As always, let me know what you Thinq.

Posted on - Comment
Categories: Rx - VB Dev Center -

Updating Reactive Samples to 10425 build

Today, I decided to take the plunge and update my WPF and Silverlight Reactive Extensions samples to the latest (at the time of this writing) build of Rx: version 1.1.10425.0. At this point, I have purposely left the phone samples on the blog targeting the original shipping bits, so they aren’t affected at this point.

(ED: the 1.0.10605 stable and experimental releases shipped as of 6/5/2011. Looking at the release notes, this update appears to have far fewer breaking changes beyond those in the 1.1.10425 build.)

As discussed in the Rx Forum post, this update has quite a number of breaking changes. Lee Campbell has a nice post discussing a number of these changes. If you don’t care about the changes and just want to download the revised samples, head on over to the Files page and try them out:

  • RX_Wpf
    Reactive Framework samples from the "Becoming an RxPusher with the Reactive Framework" talk. With the emergence of LINQ, we discovered the power and flexibility that come from the IEnumerable interface. This pull model makes iterating over sets of data and performing filtering, transformation, and aggregation operations easy through LINQ. However, the pull model breaks down in asynchronous and event-driven environments. In evaluating the options, we discovered that the IObservable interface and the push model were effectively analogous to the pull model of IEnumerable. As a result, we can make event-driven asynchronous programming easier and more declarative by using the Reactive Framework and LINQ to Events.

    (Uploaded on 6/8/2011 - File Size 183442)

  • RX_Silverlight
    Slides and demos for using the Reactive Framework in Silverlight 4.

To help you update your existing projects, I figured I would share a bit of my experience with this update.

First of all, remove the existing references to the old builds of Rx, including System.CoreEx, System.Interactive, and System.Reactive. Next, download the new build either directly from the MSDN Data Development center, or by using the NuGet Package Manager and installing the Rx-Main package (Install-Package Rx-Main from the Package Manager Console). You do want to be careful here, because the older 1.0.2856.0 build is still available via NuGet as well. There are some cases where you might need access to the older build, particularly where you need access to the IEnumerable extensions that were added to complement the new ones on IObservable, like .Do. Also, note that the RxJs library has not yet been updated to the new method names, so you will need the older packages for RxJs support as well.

Once you have the new package downloaded and installed, add a reference to the System.Reactive assembly in your solution.

You won’t need to remove Imports or using clauses from your classes because the older builds of Rx built the extensions directly on top of the System.Linq namespace. In the new builds, the LINQ extensions live in the System.Reactive.Linq namespace, so you will need to add an Imports/using clause for System.Reactive.Linq. If you used the ObserveOnDispatcher/SubscribeOnDispatcher methods, these have been replaced by the simpler ObserveOn and SubscribeOn. As a result, you may want to add an Imports/using clause for System.Threading as well so that you can access the SynchronizationContext more directly. With that in place, change your calls to .ObserveOnDispatcher as follows:


ObserveOn(SynchronizationContext.Current)

Next, we need to update some of the Observable factory implementations that we have used to create observables. The Observable.FromEvent method has changed to more closely align with Observable.FromAsyncPattern and now uses “FromEventPattern”. For example, the Mouse drag-drop example that we discussed in this post, now starts out as follows:


Dim mouseDown = From evt In Observable.FromEventPattern(Of MouseButtonEventArgs)(image, "MouseDown")
                Select evt.EventArgs.GetPosition(image)
Dim mouseUp = Observable.FromEventPattern(Of MouseButtonEventArgs)(image, "MouseUp")
Dim mouseMove = From evt In Observable.FromEventPattern(Of MouseEventArgs)(image, "MouseMove")
                Select evt.EventArgs.GetPosition(Me)

Similarly, the Observable.GenerateWithTime method has now been refined to the more generalized Observable.Generate and uses a TimeSpan override to specify the time interval.
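A sketch of the replacement call (the variable names are illustrative): where we previously called Observable.GenerateWithTime, we now use the Generate overload whose final parameter selects the delay before each value.

```vbnet
' GenerateWithTime is gone; the Generate overload with a TimeSpan-returning
' time selector produces the same timed sequence.
Dim ticks = Observable.Generate(0,
                                Function(x) True,
                                Function(x) x + 1,
                                Function(x) x,
                                Function(x) TimeSpan.FromSeconds(1))
```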

Another such simplification was made to the Buffering and Windowing operations that we used in this post, including BufferWithTime, BufferWithCount, WindowWithTime and WindowWithCount. Now we just have a common Buffer operator that returns an IObservable<IList<T>> and Window which returns an IObservable<IObservable<T>>. Overload resolution determines whether we are passing in a TimeSpan or an integer and uses the appropriate flavor of Buffer and Window as necessary. As a result, we can change our sorting code to the following:


Dim segmented = Sensor.Buffer(TimeSpan.FromSeconds(3))
segmented.Subscribe(Sub(val)
                        FilteredList.ItemsSource = From v In val
                                                   Order By v.SensorValue
                    End Sub)

One last item that I needed to change was to restore the Do extension method on IEnumerable because I have come to love that function. Fortunately, implementing Do is relatively easy. Unfortunately, the implementation relies on iterators, which VB doesn't support yet, so we need to put it in a C# project. Here’s the definition of the new Do extension method:


public static class IEnumerableEx
{
   public static IEnumerable<T> Do<T>(this IEnumerable<T> source, Action<T> action)
   {
        foreach (T item in source)
        {
            action(item);
            yield return item;
        }
    }
}

I’m sure you may run into other issues when updating your projects due to the breaking changes. Refer to the forum post above if you need help on your particular issue.

Posted on - Comment
Categories: VB Dev Center - Rx -

Using RX to detect shake Gestures

Part of the power of RX lies in its ability to compose complex operations and keep the resulting code maintainable. I previously showed how to perform Drag-Drop operations with RX. This time, I want to take a look at a slightly more complex operation: detecting “Shake” gestures on the Windows Phone 7.

The phone includes the ability to detect motion in 3D space through the built-in Accelerometer in the Microsoft.Devices.Sensors library. This sensor raises events when the phone is moved with information about how forcefully it was moved in the EventArgs. Detecting shakes is more complex than just knowing if the device was moved. We need to make sure that the user’s motion was aggressive enough to warrant a shake detection.

In addition, we need to know if the user moved the phone aggressively enough multiple times within a small enough time span. Simply monitoring the ReadingChanged event doesn’t fill the needs of detecting a real “Shake”. To manage all of these state changes and the times that each change occurs with traditional imperative code, we would need to set up a number of queues remembering each motion that exceeds the tolerance and the time each happens, and then act upon them when a sufficient number of these movements happen within a given time threshold. GoogleBinging this finds a number of sample implementations including Joel Johnson’s article and the recently released Shake Gesture Library. Both of these versions work with traditional events and manage the state internally.

If we use RX, we can simplify the code a bit by taking advantage of Observable.FromEvent to create an observable collection from the Accelerometer.ReadingChanged event, and the TimeInterval method to track the amount of time that passes between each accelerometer reading that exceeds the given tolerance (MinimumOffset).


Imports System.Linq
Imports Microsoft.Devices.Sensors
Imports Microsoft.Phone.Reactive

Public Module ShakeObserver
    Const MinimumOffset = 1.44
    Const TimeThreshold = 200

    Public Function GetObserver(ByVal accel As Accelerometer) As IObservable(Of IEvent(Of AccelerometerReadingEventArgs))

        Dim readingChangedObservable = Observable.FromEvent(Of AccelerometerReadingEventArgs)(accel, "ReadingChanged")

        Dim query = From knocks In
                    (From startEvent In readingChangedObservable
                     Where (startEvent.EventArgs.X ^ 2 + startEvent.EventArgs.Y ^ 2) > MinimumOffset).
                    TimeInterval
                    Where knocks.Interval.TotalMilliseconds < TimeThreshold
                    Select knocks.Value

        Return query
    End Function
End Module

We can then consume this ShakeObserver in client code as we would any other Observable collection.


Dim accel As New Accelerometer
accel.Start()
Dim query = From shake In GetObserver(accel)
            Select shake

query.Subscribe(Sub(args) DoSomething())

Of course, if we are composing even more complex interactions, the power of using Observables here would be even greater as that’s where RX truly shines.

Posted on - Comment
Categories: Rx - VB Dev Center - WP7 -

Create an RSS Feed for PDC 2010 videos

I love the fact that Microsoft makes its conference materials available for those unfortunate enough not to be able to attend. I also love watching the videos on my Zune. Even better is when I can use the Zune podcasting ability to download these videos. So far, I wasn’t able to find such a feed. Thankfully, fellow MVP, Bill McCarthy posted some quick LINQ to XML code to generate HTML tables based on the sessions XML that was used for PDC. You can read his post here: http://msmvps.com/blogs/bill/archive/2010/11/03/pdc-2010-sessions.aspx.

To take this a step further, I modified his code to generate a quick RSS feed that I can use in the Zune software to download them as if they were a podcast. Here’s the revised code:

Dim doc = XDocument.Load("http://videoak.microsoftpdc.com/pdc_schedule/Schedule.xml")
        Response.Write(<?xml version="1.0" encoding="UTF-8"?>
                       <rss version="2.0">
                           <channel>
                               <title>PDC Videos</title>
                               <link>http://www.Microsoftpdc.com</link>
                               <description>Download content files for PDC 2010.</description>
                               <generator>LINQ</generator>
                               <%= From session In doc...<Session>
                                   From content In session...<Content>
                                   Where content.@Url.EndsWith("Low.wmv")
                                   Select <item>
                                              <title><%= session.<ShortTitle>.Value & " - " & content.@Title %></title>
                                              <link><%= session.@ShortUrl %></link>
                                              <enclosure url=<%= content.@Url %>/>
                                          </item> %>
                           </channel>
                       </rss>)


Note: This code does require VB10. If you want to do it with VB9, just add the line continuation characters (_).

Posted on - Comment
Categories: Linq to XML - VB Dev Center -

Reactive Extensions responding to UI events

One of the great things about the Reactive Extensions is that they allow you to express rather complex interactions simply. For this example, we’ll consider the mouse drag-drop operation in Silverlight. Note: The identical code works in both the web-based Silverlight and on Windows Phone 7. If you want to download the phone version of this code, it is available in C# or VB.

One of the indicators of simplicity is if you can explain a concept to your mother. How would you explain drag-drop to your mother?  It may go something like this:

  • Record the start location when you press the mouse button down.
  • While you are moving the mouse, record the end location until you let the mouse button up.
  • Calculate the difference between the start and end locations and move the image you dragged accordingly.

Now, let’s see how we can do this in code. The beauty of the Reactive programming model is that we can compose multiple expressions together in a concise manner. Here’s how we accomplish the above process:


Dim q = From startLocation In mouseDown
        From endLocation In mouseMove.TakeUntil(mouseUp)
        Select New With {
            .X = endLocation.X - startLocation.X,
            .Y = endLocation.Y - startLocation.Y
        }

As you can see, this code reads nearly the same as the description you gave to your mother. If anything, it is even more concise than the verbal description.

At this point we have a couple details to wrap up. First, we need to declare the variables for mouseDown, mouseMove and mouseUp that we used above. We do this by using RX to subscribe to the events as discussed in this post.


Dim mouseDown = From evt In Observable.FromEvent(Of MouseButtonEventArgs)(image, "MouseLeftButtonDown")
                Select evt.EventArgs.GetPosition(image)
Dim mouseUp = Observable.FromEvent(Of MouseButtonEventArgs)(Me, "MouseLeftButtonUp")
Dim mouseMove = From evt In Observable.FromEvent(Of MouseEventArgs)(Me, "MouseMove")
                Select evt.EventArgs.GetPosition(Me)

While we’re here, notice that we are grabbing the MouseLeftButtonDown event of the image. However, we track the mouseMove and mouseUp on the form itself. We could use the MouseMove and MouseLeftButtonUp events of the image, but if we try to move too fast, the time Silverlight takes to calculate that the mouse is moving on the image rather than the canvas can mean that your movement is detected off of the image before you’ve been able to move the image. Tracking on the form itself drastically increases performance and reduces the possibility that you will move off of the image too soon.

The last thing we need to do is to move the image to the new location. In this sample, we placed the image on a Canvas. We just need to use the distance we recorded in our query and subscribe to the observable with an action that moves the image:


q.Subscribe(Sub(value)
                Canvas.SetLeft(image, value.X)
                Canvas.SetTop(image, value.Y)
            End Sub)

If you want to see this code in action, the VB version is available in the WPF samples and WP7 samples on the download page. The C# sample is in the Silverlight RX samples and WP7 samples also on the download page.

Posted on - Comment
Categories: VB Dev Center - Rx -

Reactive Extensions Phone 7 samples in VB

For a while, those of us who love Visual Basic have been struggling to make sure that newly released platforms include support for VB. When platforms that cater to the hobbyist, such as the new Windows Phone 7 tools, are introduced without support for VB, we are particularly saddened. Happily, the team worked hard to overcome this shortcoming and announced today availability of the Windows Phone 7 tools in Visual Basic using Silverlight. You can download the tools now.

In celebration of this opportunity, I have converted my RX samples over to VB and made them available on my downloads page. I’ll post explanations of each of the samples over the next few days. For now, feel free to download the samples and try them out for yourself. Here’s the list of samples that are included:

Posted on - Comment
Categories: VB Dev Center - Rx -

Reactive Framework Sorting it out

When I started looking at the Reactive Framework, one of the first things I did was to try creating some of the same standard LINQ queries that I’ve used against LINQ to Objects:


        Dim unsorted = From s In Sensor
                       Where s.SensorType = "2"
                       Order By s.SensorValue
                       Select s

If you try this where Sensor is an IObservable rather than IEnumerable, you will find that the Order By clause generates the following compiler error in VB: Definition of Method OrderBy is not accessible in this context. C# generates a similar but different error: “Could not find an implementation of the query pattern for source type IObservable<T>. OrderBy not found.” Essentially, the compiler is telling you that there isn’t an extension method called OrderBy that extends IObservable. Did the reactive team make a mistake and forget to implement sorting? Far from it.

Let’s consider the uses of the standard query operators over a data source where you don’t necessarily know when the source ends. “From” doesn’t really exist; it’s just a placeholder identifying the local variable name (s) used later in the query and the source of the data (Sensor).

With “Where”, we are filtering the results. We can filter results over an ongoing stream without needing to know when the stream will end. As a result, filtering isn’t much of an issue.

Similarly, “Select” simply takes the input type and transforms it into another type. This is commonly referred to as a Projection. Since projections work equally well over data streams, we are fine implementing that in Reactive.

Sorting on the other hand is a bit more problematic. Consider the case where we process the following values: 1, 2, 4, 3, 5. It’s not difficult to sort these values and return them. However, what would happen to our sort if the next value that was sent was 0? We would need to reevaluate the entire result set and inject our new value before the first value that came in. In dealing with continuous event streams, we have no way of knowing whether the next value we are going to receive will need to be inserted prior to other results.

As a result, if we need to sort these values, we need to partition the sets of data we receive so that we know when the last value of each set has arrived. The Reactive Framework supports a number of partitioning methods, including BufferWithTime, BufferWithCount, and BufferWithTimeOrCount. With these methods, we can partition our streams into pre-determined chunks based on a timespan and/or item count. The result is a new stream of IObservable objects that contain an IList of the original data type. In the case of our Sensors, we can partition our result sets as follows:


Dim segmented = Sensor.BufferWithTime(TimeSpan.FromSeconds(3))

This creates a variable of type IObservable(Of IList(Of SensorInfo)). If we wanted, we could then display the sorted values in the partitioned lists as follows:


  segmented.Subscribe(Sub(val) FilteredList.ItemsSource =
                                         From v In val
                                         Order By v.SensorValue)

As you can see, you CAN sort values using the Reactive Framework using partitioning schemes, but it doesn’t make as much sense over data streams as it does with IEnumerable data sources typically encountered with LINQ.

Posted on - Comment
Categories: Rx - VB Dev Center -

Reactive Framework Subscribing to Events

Previously in my Reactive Framework series, we saw how to create and subscribe to ongoing observable objects. While there are a number of cases where you would want to create your own observable type, often you simply want to compose reactive sequences in response to events raised by other means. Recently, I came across a simple example that shows how easy it is to subscribe to an event and add functionality through the Reactive Framework’s extension methods.

In this scenario, I needed to update a list of most recently used files in real time. Whenever a file was added, modified, or deleted from a directory, I wanted my UI list to reflect this change. I’ve long known about the FileSystemWatcher class in Windows Forms. It is able to listen for create, change, rename, and delete events in a specified file path and let us know when the file changes. Using Rx, we can create an observable using the following:


   Dim createWatcher As New FileSystemWatcher With {.Path = "C:\Temp", .EnableRaisingEvents = True}
   Dim createdEvent = Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Created")

Using the Observable.FromEvent, we indicate that we want to watch for events with the specified name (Created) from the supplied instance object (createWatcher). With this observable, we can now perform other operations on the resulting events. We’ll use the “Do” method to perform an action (refreshing the file list). Before we do this action on the UI, we’ll need to make sure to synchronize back to the UI thread:


    Dim AllEvents = Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Created").
                    ObserveOnDispatcher.
                    Do(Sub(fsArgs) RefreshFileList()).
                    Subscribe

This would be fine if we only wanted to watch for the events when the file is first created. However, in a Most Recently Used (MRU) list, we want to also know when a file is changed or deleted. Rather than wiring up separate handlers for each of these events, we can use the Merge method to listen to any of these events regardless of which event handler they came from:


    Dim AllEvents = Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Created").
                    Merge(Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Changed")).
                    Merge(Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Deleted")).
                    ObserveOnDispatcher.
                    Do(Sub(fsArgs) RefreshFileList()).
                    Subscribe

One of the great things about the Reactive Framework is the ability to inject functionality into the event pipeline easily. For example, if we want to avoid responding to multiple events on the same file, we could inject the DistinctUntilChanged method as follows:


    Dim AllEvents = Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Created").
                    Merge(Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Changed")).
                    Merge(Observable.FromEvent(Of FileSystemEventArgs)(createWatcher, "Deleted")).
                    DistinctUntilChanged.
                    ObserveOnDispatcher.
                    Do(Sub(fsArgs) RefreshFileList()).
                    Subscribe

Quick and easy (and elegant as well.) If you want to try this out, download RX_Wpf in the files section here and run the FileWatcher.xaml file.

Posted on - Comment
Categories: VB Dev Center - Rx -

Reactive Framework as a Background Worker

In this introduction to the Reactive Framework series, we’ve spent a bit of time setting up our Observable and Observers and wiring them up. If you haven’t been following along, here are links to the previous posts:

So far, our observers can listen to our sensor, but it turns out we can’t see the results because everything is happening on the main thread. Because the thread is continually processing, the UI locks us out of seeing the updates. To solve this, we need to run our sensor on a secondary thread.

With the Reactive Framework, we often talk about “Hot” and “Cold” observables. Hot observables run independently of the subscription. Cold observables start their processing when you subscribe to them. In our case, we’re simulating an ongoing sensor that we connect many observers to, so we are dealing with a “Hot” observable. As a result, we’ll explicitly manage the sensor using the BackgroundWorker object in our “Start” button handler:


        Dim worker As New BackgroundWorker
        AddHandler worker.DoWork, Sub(s As Object, args As DoWorkEventArgs)
                                      Sensor.StartSensor()
                                  End Sub
        worker.RunWorkerAsync(Sensor)

Now, when we run our sample and output our results using Console.WriteLine, we see our results and we can continue to click on other buttons in our application. However, if we try to output the results to our user interface, we see the following exception:

     InvalidOperationException: The calling thread cannot access this object because a different thread owns it.

If you’ve ever worked with background threads in Windows Forms, WPF or Silverlight, you should recognize that you can’t access the UI thread from a background thread directly. One of the key scenarios that the Reactive Framework was designed to address was asynchronous operations. As a result, the team took great effort to make synchronizing these threads easy. Two of the extension methods on IObservable are SubscribeOn and ObserveOn. SubscribeOn is used to indicate where the operations that we are subscribing to will be performed. ObserveOn is used to indicate where we want to process the results.
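As a rough illustration of that distinction (a sketch; the Scheduler.ThreadPool name assumes the scheduler exposed by the Rx builds of this era):

```vbnet
' SubscribeOn: the subscription work runs on the thread pool.
' ObserveOnDispatcher: each OnNext is marshaled back to the UI dispatcher
' before our handler runs.
Dim query = Sensor.
        SubscribeOn(Scheduler.ThreadPool).
        ObserveOnDispatcher()

query.Subscribe(Sub(info) Console.WriteLine(info.SensorValue))
```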

In our case, we need to move back to the UI thread when we process the results, so we synchronize our threads when we observe; thus we will use ObserveOn. To make matters easier, the Reactive team has included a special variant of ObserveOn that synchronizes on the dispatcher thread: ObserveOnDispatcher. We can alter our subscribing code as follows to make sure we observe our subscription on the UI thread:


        Dim items = New ObservableCollection(Of Double)
        FilteredList.ItemsSource = items

        Dim TypeSensors = From s In Sensor
                          Where s.SensorType = "4"
                          Select s.SensorValue

        TypeSensors.ObserveOnDispatcher.Subscribe(
            Sub(item) items.Add(item))

To see this sensor and various observables in action, download the corresponding WPF project for this series.

Posted on - Comment
Categories: VB Dev Center - Rx - Visual Studio -

Reactive Framework Subscribing to Observables

It’s been a while since I started the Reactive Framework series. In case you missed the earlier posts, here are the links to what we’ve done so far:

At this point, we’ve created our observer and set up the logic that handles our OnNext values. What we haven’t done yet is wire our LINQ-based processing pipeline to the event source. To do this, we need to Subscribe the handler to the Observables. By default, we need to create a new class that implements IObserver. To keep this simple, let’s just output the values to the console for now:


Class ConsoleObserver(Of T)
    Implements IObserver(Of T)


    Public Sub OnCompleted() Implements System.IObserver(Of T).OnCompleted

    End Sub

    Public Sub OnError(ByVal [error] As System.Exception) Implements System.IObserver(Of T).OnError

    End Sub

    Public Sub OnNext(ByVal value As T) Implements System.IObserver(Of T).OnNext
        Console.WriteLine(value.ToString)
    End Sub
End Class

The IObserver interface has three methods: the method that is fired when the source is done (OnCompleted), the method that is called when an exception occurs (OnError), and the method that is invoked as each new value is received (OnNext). In this simple example, we only give OnNext a body, outputting the value to the Console window.

With this in place, we can tie this all together creating and starting our sensor, filtering and projecting from the values (using LINQ) and displaying the values (through the new ConsoleObserver):


Dim Sensor As New ObservableSensor
Sensor.Start()

Dim lowvalueSensors = From s In Sensor
                      Where s.SensorValue < 3
                      Select s.SensorValue

lowvalueSensors.Subscribe(New ConsoleObserver(Of Double))

Thankfully, consuming our Observable chain doesn’t require creating a new class. Subscribe offers an additional overload which, instead of taking an IObserver, accepts an Action lambda. As a result, we can restate our example above as follows:


Dim Sensor As New ObservableSensor
Sensor.StartSensor()

Dim lowvalueSensors = From s In Sensor
                      Where s.SensorValue < 3
                      Select s.SensorValue

lowvalueSensors.Subscribe(Sub(value) Console.WriteLine(value))

While this Subscribe call is slightly longer, the net effect is significantly more maintainable code, since we no longer need to declare a separate class just to output our results.

We now have a fully functioning set of examples. Unfortunately, our example at this point is extremely unresponsive because we are completely CPU bound, constantly running all of the processing on the current thread. Up next time: moving our logic to a background thread.

Posted on - Comment
Categories: Rx - VB - VB Dev Center -

Reactive Framework Getting your LINQ on

Last time in our exploration of the Reactive Framework, we built a random Observable event generator. Now that we have our data source, we can start working with it. In the past, we would have hooked up event handlers to the event delegate and imperatively interacted with the values passed in the sender and EventArgs. Of course, when we Thinq LINQ, we try to find simpler, more declarative models to represent our intent.

To start, we need to instantiate and start our event generator:


Private Sensor As New ObservableSensor
Sensor.StartSensor()

Now that we are generating Observables, we can process them using LINQ query comprehensions. For example, if we wanted to filter for only the sensors whose type is "4", we could use this LINQ:


Dim TypeSensors = From s In Sensor
                  Where s.SensorType = "4"
                  Select s

If we wanted to filter out only those sensor readings that are low (less than 3), and return only the sensor's value, we could combine the filter (Where) and projection (Select). The following results in an IObservable(Of Double) rather than the IObservable(Of SensorInfo) that we started with.


Dim lowValueSensors = From s In Sensor
                      Where s.SensorValue < 3
                      Select s.SensorValue

Of course, if you prefer lambda syntax over query comprehensions, you can use them interchangeably with rX just as you would with LINQ. The following query waits for the first case where the sensor value is high (over 17) and fires OnNext, returning a Boolean once that value is hit.


Dim AnySensor = Sensor.Any(Function(s) s.SensorValue > 17)

All's well in LINQ land with rX, right? Well, kind of. Simple projections and filters like these are straightforward. However, once we start trying to use sorting, grouping, and aggregations, we run into additional challenges. With these query types, we can't start returning results until the entire set of events is known. Since we're working with a potentially infinite stream of events, we need to figure out how to partition the results and work with those segments. That will be a task for a future post.
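One common way to make aggregates tractable over an unbounded stream is to partition it into fixed-size windows and aggregate each window independently. A rough sketch of the idea, with plain Python iterators standing in for observables (the `windows` and `readings` helpers are illustrative, not Rx operators):

```python
from itertools import islice

def windows(stream, size):
    """Partition a (possibly unbounded) iterator into fixed-size lists."""
    it = iter(stream)
    while True:
        chunk = list(islice(it, size))
        if not chunk:
            return
        yield chunk

def readings():
    """Stand-in for an endless sensor feed (deterministic fake values)."""
    n = 0
    while True:
        yield n % 20
        n += 1

# Aggregate per window instead of over the whole unbounded stream:
window_averages = (sum(w) / len(w) for w in windows(readings(), 5))
first_two = list(islice(window_averages, 2))
```

Because each window is finite, ordinary aggregates (sum, average, even sorting) become meaningful again, while the overall pipeline still never needs to see the end of the stream.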

Posted on - Comment
Categories: Rx - VB Dev Center - VB -

Reactive Framework Building an IObservable Event Generator

In my last post, I mentioned a number of cases where you may want to use the Reactive Framework. For some upcoming presentations, I wanted to focus on a couple of these scenarios, particularly on how you can use the Reactive Framework (rX) to work with events from device sensors. You can find these kinds of sensors in a number of industries, including robotics, automated manufacturing systems, medical monitors, telecom usage, and live financial feeds. In order to demonstrate using rX in this environment, I needed to build a module that simulates generating a stream of random sample events. Below is the module that I created. We’ll use it in some of the future discussions of the Reactive Framework.

We’re going to start with a small class that will contain the state of the individual sensor events. We’ll call this our SensorInfo class. It will hold values for the date and time that the event occurred, an indicator on the sensor’s type and the value that the sensor returns. We'll also override the ToString method to allow us to output the values easily.


Public Class SensorInfo
    Public Property TimeStamp As DateTime
    Public Property SensorType As String
    Public Property SensorValue As Double

    Public Overrides Function ToString() As String
        Return String.Format("Time: {0}  , Type: {1}  Value: {2}", TimeStamp, SensorType, SensorValue)
    End Function
End Class

Now that we have our instance class, we can create a class that will generate these sensor items randomly. (This class is not thread safe, nor is it truly random so don't use it in production applications. It is merely designed for demonstration purposes.)


Public Class ObservableSensor

    Private _running As Boolean

    Public Sub StartSensor()
        If Not _running Then
            _running = True
            Dim randomizer = New Random(Date.Now.Millisecond)
            While _running
                Dim randVal = randomizer.NextDouble
                Dim info As New SensorInfo With {.SensorType = CInt(randVal * 4).ToString(),
                                                 .SensorValue = randVal * 20,
                                                 .TimeStamp = Now}
                ' info is generated but not yet published; we'll wire up observers shortly
                Threading.Thread.Sleep(CInt(randomizer.NextDouble * 500))
            End While
        End If
    End Sub

    Public Sub StopSensor()
        _running = False
    End Sub

End Class

In this class, we maintain an internal variable (_running) which tracks whether the sensor is running. We also have methods that start (StartSensor) and stop (StopSensor) the sensor. While the sensor is running, we essentially generate a number of SensorInfo instances with randomized values, pausing for a random period of time before creating each new value. At this point, the values that are returned don’t have much meaning. We could easily change this to return stock quotes, manufacturing defects, or other sensor responses by manipulating the values this randomizer generates.

Now that we can generate random SensorInfos, we need to actually do something with them. In the past, we would just raise an event for consumers to handle after generating each sensor value. Since we want to leverage the power of the new IObservable/IObserver interfaces and the Reactive Framework, I’ll make this class implement IObservable(Of T) so that we can register a number of IObserver clients and notify them each time we generate a new reading.

The IObservable(Of T) interface requires a single method: Subscribe. It takes one parameter, the IObserver client that wants to listen to our sensor data, and returns an object that implements IDisposable (so that each of our observers can be told when we’re done sending them data). Since the object returned here is actually the ObservableSensor itself, we need to implement both IObservable and IDisposable. Here's our revised ObservableSensor class.


Public Class ObservableSensor
    Implements IObservable(Of SensorInfo)
    Implements IDisposable

    Private _observers As New List(Of IObserver(Of SensorInfo))
    Private _running As Boolean

    Public Function Subscribe(ByVal observer As System.IObserver(Of SensorInfo)) _
                              As System.IDisposable _
                              Implements System.IObservable(Of SensorInfo).Subscribe
        _observers.Add(observer)
        Return Me
    End Function

    Public Sub StartSensor()
        If Not _running Then
            _running = True
            Dim randomizer = New Random(Date.Now.Millisecond)
            While _running
                Dim randVal = randomizer.NextDouble
                If _observers.Any Then
                    Dim info As New SensorInfo With {.SensorType = CInt(randVal * 4).ToString,
                                                     .SensorValue = randVal * 20,
                                                     .TimeStamp = Now}

                    _observers.ForEach(Sub(o) o.OnNext(info))
                End If
                Threading.Thread.Sleep(CInt(randomizer.NextDouble * 500))
            End While
        End If
    End Sub

    Public Sub StopSensor()
        _running = False
    End Sub


#Region "IDisposable Support"
    Private disposedValue As Boolean ' To detect redundant calls

    ' IDisposable
    Protected Overridable Sub Dispose(ByVal disposing As Boolean)
        If Not Me.disposedValue Then
            If disposing Then
                If _observers IsNot Nothing Then
                    _observers.ForEach(Sub(o) o.OnCompleted())
                    _observers.Clear()
                End If
                ' TODO: dispose managed state (managed objects).
            End If

            ' TODO: free unmanaged resources (unmanaged objects) and override Finalize() below.
            ' TODO: set large fields to null.
        End If
        Me.disposedValue = True
    End Sub

    ' This code added by Visual Basic to correctly implement the disposable pattern.
    Public Sub Dispose() Implements IDisposable.Dispose
        ' Do not change this code.  Put cleanup code in Dispose(ByVal disposing As Boolean) above.
        Dispose(True)
        GC.SuppressFinalize(Me)
    End Sub
#End Region

End Class

In this new version, we have a new _observers field that maintains a list of the observers (clients). This allows us to notify multiple sensor handlers and let each work with the data as it deems appropriate. The Subscribe method simply takes the supplied observer and adds it to the collection.

When we start the sensor, we now check to see if there are any observers (using the LINQ .Any method). If there are, we generate the random sensor data and then notify all of the listeners using the list’s .ForEach method, passing a lambda expression that instructs each observer to invoke its OnNext handler (part of the IObserver(Of T) implementation). This is the method that roughly corresponds to IEnumerator’s MoveNext, and it is what triggers the Reactive Framework’s event pipeline to begin processing our sensor notifications.

When we’re done, we need to clean up our resources. In the Dispose method, we make sure to call OnCompleted on each of the observers in our _observers collection (via ForEach). We also clear the observer collection to remove the references between the clients and our sensor generator.
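The reason Subscribe returns an IDisposable becomes clearer with a sketch: the returned handle is what lets a single client detach without tearing down the whole source. Here is an illustrative Python analog (hypothetical Subscription and Sensor classes) that hands back a dedicated handle per observer rather than the sensor itself, which is arguably cleaner than returning Me as the VB sample above does:

```python
class Subscription:
    """The IDisposable analog: a handle that detaches one observer."""

    def __init__(self, observers, observer):
        self._observers = observers
        self._observer = observer

    def dispose(self):
        if self._observer in self._observers:
            self._observers.remove(self._observer)

class Sensor:
    """Toy observable; observers are plain callables here."""

    def __init__(self):
        self._observers = []

    def subscribe(self, observer):
        self._observers.append(observer)
        return Subscription(self._observers, observer)

    def emit(self, value):
        for o in list(self._observers):   # copy: observers may detach mid-loop
            o(value)

seen = []
sensor = Sensor()
handle = sensor.subscribe(seen.append)
sensor.emit(1)
handle.dispose()   # after disposal this observer hears nothing more
sensor.emit(2)     # seen stays [1]
```

With a per-observer handle, disposing one subscription leaves every other client, and the sensor itself, untouched.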

There you have it, a generic random event generator that we can consume with the Reactive Framework (or similar technologies like StreamInsight). Next time, we’ll start to consume these events.

As always, let me know what you Thinq and if there are any modifications I should consider.

Posted on - Comment
Categories: Rx - VB - VB Dev Center - Visual Studio -

LINQ to CSV using DynamicObject and TextFieldParser

In the first post of this series, we parsed our CSV file by simply splitting each line on a comma. While this works for simple files, it becomes problematic when consuming CSV files where individual fields also contain commas. Consider the following sample input:

CustomerID,COMPANYNAME,Contact Name,CONTACT_TITLE
ALFKI,Alfreds Futterkiste,Maria Anders,"Sales Representative"
ANATR,Ana Trujillo Emparedados y helados,Ana Trujillo,"Owner, Operator"
ANTON,Antonio Moreno Taqueria,Antonio Moreno,"Owner"

Typically, when a field in a CSV file includes a comma, the field is quote escaped to designate that the comma is part of the field and not a delimiter. In the previous versions of this parser, we didn’t handle these cases. As a result, the following unit test would fail given this sample data:


    <TestMethod()>
    Public Sub TestCommaEscaping()
        Dim data = New DynamicCsvEnumerator("C:\temp\Customers.csv")
        Dim query = From c In data
                    Where c.ContactTitle.Contains(",")
                    Select c.ContactTitle

        Assert.AreEqual(1, query.Count)
        Assert.AreEqual("Owner, Operator", query.First)
    End Sub

We could add code to handle the various escaping scenarios here. However, as Jonathan pointed out in his comment to my first post, there are already methods in the .Net framework that can do CSV parsing. One of the most flexible is the TextFieldParser in the Microsoft.VisualBasic.FileIO namespace. If you code in C# instead of VB, you can simply add a reference to the Microsoft.VisualBasic assembly and access this power from your language of choice.
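For comparison, Python's standard csv module applies the same quoting rules. A quick sketch with the sample data above shows the quoted comma surviving as part of its field rather than splitting it:

```python
import csv
import io

sample = (
    "CustomerID,COMPANYNAME,Contact Name,CONTACT_TITLE\n"
    'ALFKI,Alfreds Futterkiste,Maria Anders,"Sales Representative"\n'
    'ANATR,Ana Trujillo Emparedados y helados,Ana Trujillo,"Owner, Operator"\n'
    'ANTON,Antonio Moreno Taqueria,Antonio Moreno,"Owner"\n'
)

rows = list(csv.reader(io.StringIO(sample)))
header, data = rows[0], rows[1:]
titles = [r[3] for r in data]   # the quoted comma stays inside one field
```

A naive `split(",")` on the ANATR line would have produced five fields; the quote-aware reader correctly produces four.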

Retrofitting our existing implementation to use the TextFieldParser is fairly simple. We begin by changing the _FileStream field to a TextFieldParser rather than a stream reader. We keep this as a class-level field in order to stream through our data as we iterate over the rows.

In the GetEnumerator we then instantiate our TextFieldParser and set the delimiter information. Once that is configured, we get the array of header field names by calling the ReadFields method.


    Public Function GetEnumerator() As IEnumerator(Of Object) _
        Implements IEnumerable(Of Object).GetEnumerator

        _FileStream = New Microsoft.VisualBasic.FileIO.TextFieldParser(_filename)
        _FileStream.Delimiters = {","}
        _FileStream.HasFieldsEnclosedInQuotes = True
        _FileStream.TextFieldType = FileIO.FieldType.Delimited

        Dim fields = _FileStream.ReadFields
        _FieldNames = New Dictionary(Of String, Integer)
        For i = 0 To fields.Length - 1
            _FieldNames.Add(GetSafeFieldName(fields(i)), i)
        Next

        Return Me
    End Function

    Public Function MoveNext() As Boolean Implements IEnumerator.MoveNext
        Dim line = _FileStream.ReadFields
        If line IsNot Nothing AndAlso line.Length > 0 Then
            _CurrentRow = New DynamicCsv(line, _FieldNames)
            Return True
        Else
            Return False
        End If
    End Function

While we are at it, we also change our MoveNext method to call ReadFields to get the parsed string array of values from the next line. When we run out of data, ReadFields returns Nothing and we return False from MoveNext to stop the enumeration. We had to make one other change here: in the old version, we passed the full unparsed line to the constructor of the DynamicCsv type and did the parsing there. Since our TextFieldParser now handles that for us, we’ll add an overloaded constructor to our DynamicCsv DynamicObject accepting the pre-parsed string array:


Public Class DynamicCsv
    Inherits DynamicObject

    Private _fieldIndex As Dictionary(Of String, Integer)
    Private _RowValues() As String

    Friend Sub New(ByVal values As String(),
                   ByVal fieldIndex As Dictionary(Of String, Integer))
        _RowValues = values
        _fieldIndex = fieldIndex
    End Sub

With these changes, we can run our original unit test, including the comma in the Contact Title of the second record, and it now passes.

If you like this solution, feel free to download the completed Dynamic CSV Enumerator library and kick the tires a bit. There is no warranty expressed or implied, but please let me know if you find it helpful and any changes you would recommend.

Posted on - Comment
Categories: Dynamic - LINQ - VB - VB Dev Center - Visual Studio -

LINQ to CSV using DynamicObject Part 2

In the last post, I showed how to use DynamicObject to make consuming CSV files easier. In that example, we showed how we can access CSV columns using the standard dot (.) notation that we use on other objects. Using DynamicObject, we can refer to item.CompanyName and item.Contact_Name rather than item(0) and item(1).

While I’m happy about the new syntax, I’m not content with replacing spaces with underscores, as that doesn’t agree with the coding guideline of using Pascal casing for properties. Because we control how the accessors work, we can modify the convention. Let’s reconsider the CSV file that we’re working with. Here’s the beginning:

CustomerID,COMPANYNAME,Contact Name,CONTACT_TITLE,Address,City,Region,PostalCode,Country,Phone,Fax
ALFKI,Alfreds Futterkiste,Maria Anders,Sales Representative,Obere Str. 57,Berlin,NULL,12209,Germany,030-0074321,030-0076545
ANATR,Ana Trujillo Emparedados y helados,Ana Trujillo,Owner,Avda. de la Constitución 2222,Mexico D.F.,NULL,5021,Mexico,(5) 555-4729,(5) 555-3745
ANTON,Antonio Moreno Taqueria,Antonio Moreno,Owner,Mataderos  2312,Mexico D.F.,NULL,5023,Mexico,(5) 555-3932,NULL

Notice here that the header row contains values in a mix of Pascal case, all caps, words with spaces, and underscores. To standardize this, we could parse the header and force an upper-case letter at the beginning of each word. That would take a fair amount of parsing code. As a fan of case-insensitive programming languages, I figured that if we just strip the spaces and underscores and work against the strings in a case-insensitive manner, I’d be happy. In the end, we’ll be able to consume the above CSV with the following code:


Dim data = New DynamicCsvEnumerator("C:\temp\Customers.csv")
Dim query = From c In data
            Where c.City = "London"
            Order By c.CompanyName
            Select c.ContactName, c.CompanyName, c.ContactTitle

To make this change, we change how we parse the header row and the binder name when fetching properties. In our DynamicCsvEnumerator, we already isolated the parsing of the header with a GetSafeFieldName method. Previously we simply returned the input value replacing a space with an underscore. Extending this is trivial:


    Function GetSafeFieldName(ByVal input As String) As String
        'Return input.Replace(" ", "_")
        Return input.
            Replace(" ", "").
            Replace("_", "").
            ToUpperInvariant()
    End Function

That's it for the header parsing changes. We don't need to worry about spaces in the incoming property accessor because it's not legal to use spaces in a member name. I'll also assume, by convention, that the programmer won't use underscores in the property names. Thus, the only change we need to make in our property accessor is to upper-case the incoming field name to implement the case-insensitivity feature. Here's the revised TryGetMember implementation.


    Public Overrides Function TryGetMember(ByVal binder As GetMemberBinder,
                                           ByRef result As Object) As Boolean
        Dim fieldName = binder.Name.ToUpperInvariant()
        If _fieldIndex.ContainsKey(fieldName) Then
            result = _RowValues(_fieldIndex(fieldName))
            Return True
        End If
        Return False
    End Function

All we do is force the field name to upper case, and then we can look it up in the dictionary of field indexes that we set up last time. Simple yet effective.
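The whole convention boils down to one normalization applied on both sides: once at header-parse time and once at member-lookup time. A compact Python sketch of the same scheme (the helper names are illustrative):

```python
def safe_field_name(name):
    """Drop spaces and underscores, then upper-case: one shared convention."""
    return name.replace(" ", "").replace("_", "").upper()

# Header row from the sample CSV, in its original mixed styles:
header = ["CustomerID", "COMPANYNAME", "Contact Name", "CONTACT_TITLE"]
field_index = {safe_field_name(h): i for i, h in enumerate(header)}

def lookup(accessor):
    """Resolve a Pascal-cased accessor case-insensitively."""
    return field_index.get(safe_field_name(accessor))
```

Because the same function normalizes both the header and the accessor, "CONTACT_TITLE", "ContactTitle", and "contacttitle" all resolve to the same column.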

Posted on - Comment
Categories: LINQ - VB Dev Center - Visual Studio - Dynamic -

LINQ to CSV using DynamicObject

When we wrote LINQ in Action we included a sample of how to simply query against a CSV file using the following LINQ query:


From line In File.ReadAllLines("books.csv")
Where Not line.StartsWith("#")
Let parts = line.Split(","c)
Select Isbn = parts(0), Title = parts(1), Publisher = parts(3)

While this code does make dealing with CSV easier, it would be nicer if we could refer to our columns as if they were properties where the property name came from the header row in the CSV file, perhaps using syntax like the following:


From line In MyCsvFile
Select line.Isbn, line.Title, line.Publisher

With strongly typed (compile-time) structures, it is challenging to deal with variable data structures like CSV files. One of the big enhancements coming with .Net 4.0 is the inclusion of dynamic language features, including the new DynamicObject data type. In the past, when working with dynamic runtime structures, we were limited to reflection tricks to access properties that didn't actually exist. The addition of dynamic language constructs offers better ways of dispatching the call request over dynamic types. Let's see what we need to do to expose a CSV row using the new dynamic features in Visual Studio 2010.

First, let's create an object that will represent each row that we are reading. This class will inherit from the new System.Dynamic.DynamicObject base class. This will set up the base functionality to handle the dynamic dispatching for us. All we need to do is add implementation to tell the object how to fetch values based on a supplied field name. We'll implement this by taking a string representing the current row. We'll split that based on the separator (a comma). We also supply a dictionary containing the field names and their index. Given these two pieces of information, we can override the TryGetMember and TrySetMember to Get and Set the property based on the field name:


Imports System.Dynamic

Public Class DynamicCsv
    Inherits DynamicObject

    Private _fieldIndex As Dictionary(Of String, Integer)
    Private _RowValues() As String

    Friend Sub New(ByVal currentRow As String,
                   ByVal fieldIndex As Dictionary(Of String, Integer))
        _RowValues = currentRow.Split(","c)
        _fieldIndex = fieldIndex
    End Sub

    Public Overrides Function TryGetMember(ByVal binder As GetMemberBinder,
                                           ByRef result As Object) As Boolean
        If _fieldIndex.ContainsKey(binder.Name) Then
            result = _RowValues(_fieldIndex(binder.Name))
            Return True
        End If
        Return False
    End Function

    Public Overrides Function TrySetMember(ByVal binder As SetMemberBinder,
                                           ByVal value As Object) As Boolean
        If _fieldIndex.ContainsKey(binder.Name) Then
            _RowValues(_fieldIndex(binder.Name)) = value.ToString
            Return True
        End If
        Return False
    End Function
End Class
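Python has long supported this trick natively: __getattr__ and __setattr__ play the roles of TryGetMember and TrySetMember. A rough sketch of the row class (a hypothetical DynamicCsvRow with a simplified three-column index, not the library above):

```python
class DynamicCsvRow:
    """CSV columns as attributes, mirroring TryGetMember/TrySetMember."""

    def __init__(self, current_row, field_index):
        # Bypass our own __setattr__ while wiring up internal state.
        object.__setattr__(self, "_values", current_row.split(","))
        object.__setattr__(self, "_index", field_index)

    def __getattr__(self, name):          # e.g. row.Title
        # Only called when normal lookup fails, like TryGetMember.
        if name in self._index:
            return self._values[self._index[name]]
        raise AttributeError(name)

    def __setattr__(self, name, value):   # e.g. row.Title = "..."
        if name in self._index:
            self._values[self._index[name]] = str(value)
        else:
            object.__setattr__(self, name, value)

index = {"Isbn": 0, "Title": 1, "Publisher": 2}
row = DynamicCsvRow("123,LINQ in Action,Manning", index)
```

As with DynamicObject, a miss falls through to the normal failure path (AttributeError here, returning False from TryGetMember there), so typos still surface as errors.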

With this in place, now we just need to add a class to handle iterating over the individual rows in our CSV file. As we pointed out in our book, using File.ReadAllLines can be a significant performance bottleneck for large files. Instead we will implement a custom enumerator. In our custom enumerable, we initialize the process with the GetEnumerator method. This method opens the stream based on the supplied filename. It also sets up our dictionary of field names based on the values in the first row. Because we keep the stream open through the lifetime of this class, we implement IDisposable to clean up the stream.

As we iterate over the results calling MoveNext, we will read each subsequent row and create a DynamicCsv instance object. We return this row as an Object (Dynamic in C#) so that we will be able to consume it as a dynamic type in .Net 4.0. Here's the implementation:


Imports System.Collections

Public Class DynamicCsvEnumerator
    Implements IEnumerator(Of Object)
    Implements IEnumerable(Of Object)

    Private _FileStream As IO.TextReader
    Private _FieldNames As Dictionary(Of String, Integer)
    Private _CurrentRow As DynamicCsv
    Private _filename As String

    Public Sub New(ByVal fileName As String)
        _filename = fileName
    End Sub

    Public Function GetEnumerator() As IEnumerator(Of Object) _
        Implements IEnumerable(Of Object).GetEnumerator

        _FileStream = New IO.StreamReader(_filename)
        Dim headerRow = _FileStream.ReadLine
        Dim fields = headerRow.Split(","c)
        _FieldNames = New Dictionary(Of String, Integer)
        For i = 0 To fields.Length - 1
            _FieldNames.Add(GetSafeFieldName(fields(i)), i)
        Next
       
        Return Me
    End Function

    Function GetSafeFieldName(ByVal input As String) As String
        Return input.Replace(" ", "_")
    End Function

    Public Function GetEnumerator1() As IEnumerator Implements IEnumerable.GetEnumerator
        Return GetEnumerator()
    End Function

    Public ReadOnly Property Current As Object Implements IEnumerator(Of Object).Current
        Get
            Return _CurrentRow
        End Get
    End Property

    Public ReadOnly Property Current1 As Object Implements IEnumerator.Current
        Get
            Return Current
        End Get
    End Property

    Public Function MoveNext() As Boolean Implements IEnumerator.MoveNext
        Dim line = _FileStream.ReadLine
        If line IsNot Nothing AndAlso line.Length > 0 Then
            _CurrentRow = New DynamicCsv(line, _FieldNames)
            Return True
        Else
            Return False
        End If
    End Function

    Public Sub Reset() Implements IEnumerator.Reset
        _FileStream.Close()
        GetEnumerator()
    End Sub

#Region "IDisposable Support"
    Private disposedValue As Boolean ' To detect redundant calls

    ' IDisposable
    Protected Overridable Sub Dispose(ByVal disposing As Boolean)
        If Not Me.disposedValue Then
            If disposing Then
                _FileStream.Dispose()
            End If
            _CurrentRow = Nothing
        End If
        Me.disposedValue = True
    End Sub

    ' This code added by Visual Basic to correctly implement the disposable pattern.
    Public Sub Dispose() Implements IDisposable.Dispose
        Dispose(True)
        GC.SuppressFinalize(Me)
    End Sub
#End Region

End Class
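The streaming concern that motivated this hand-written enumerator, reading one record per MoveNext rather than loading the whole file, is what a generator function gives you for free in Python. A sketch under that analogy (the `dynamic_rows` helper is hypothetical, and a dict stands in for the dynamic row):

```python
import csv
import os
import tempfile

def dynamic_rows(path):
    """Stream one record per iteration; nothing is loaded ahead of time."""
    with open(path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                 # first row supplies field names
        for values in reader:
            yield dict(zip(header, values))   # dict stands in for the dynamic row

# Usage against a throwaway sample file:
fd, path = tempfile.mkstemp(suffix=".csv")
with os.fdopen(fd, "w") as f:
    f.write("Isbn,Title\n123,LINQ in Action\n456,Sample Book\n")
titles = [row["Title"] for row in dynamic_rows(path)]
os.remove(path)
```

The `with` block also takes care of the IDisposable concern: the file handle is closed as soon as the generator is exhausted or abandoned.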

Now that we have our custom enumerable, we can consume it using standard dot notation by turning Option Strict Off in Visual Basic or referencing it as a Dynamic type in C#:

VB:



<TestMethod()>
Public Sub OpenCsv()
    Dim data = New DynamicCsvEnumerator("C:\temp\Customers.csv")
    For Each item In data
        TestContext.WriteLine(item.CompanyName & ": " & item.Contact_Name)
    Next

End Sub

C#:


[TestMethod]
public void OpenCsvSharp()
{
    var data = new DynamicCsvEnumerator(@"C:\temp\customers.csv");
    foreach (dynamic item in data)
    {
        TestContext.WriteLine(item.CompanyName + ": " + item.Contact_Name);
    }
}

In addition, since we are exposing this as an IEnumerable, we can use all of the same LINQ operators over our custom class:

VB:


Dim query = From c In data
            Where c.City = "London"
            Order By c.CompanyName
            Select c.Contact_Name, c.CompanyName

For Each item In query
    TestContext.WriteLine(item.CompanyName & ": " & item.Contact_Name)
Next

C#:


[TestMethod]
public void LinqCsvSharp()
{
    var data = new DynamicCsvEnumerator(@"C:\temp\customers.csv");
    var query = from dynamic c in data 
                where c.City == "London"
                orderby c.CompanyName
                select new { c.Contact_Name, c.CompanyName };

    foreach (var item in query)
    {
        TestContext.WriteLine(item.CompanyName + ": " + item.Contact_Name);
    }
}

Note: This sample makes a couple of assumptions about the underlying data and implementation. First, we take an extra step to translate header strings that contain spaces, replacing each space with an underscore. While including spaces is legal in the CSV header, it isn't legal in VB to say "MyObject.Some Property With Spaces". Thus we'll manage this by requiring the code to access this property as "MyObject.Some_Property_With_Spaces".

Second, this implementation doesn't handle fields that contain commas. Typically, fields in CSV files that contain commas are wrapped in quotes (and embedded quotes are likewise escaped by doubling them). This implementation does not account for either situation. I purposely left those details out in order to focus on the use of DynamicObject in this sample. I welcome enhancement suggestions to make this more robust.

Posted on - Comment
Categories: LINQ - VB Dev Center - VB - C# - Dynamic -

Pin DataTips in Visual Studio 2010

While debugging in Visual Studio 2010, I noticed that the DataTip now has a new feature. At the right hand side of the variable window, there is now a pin icon.

image

Clicking on this pin icon adds the DataTip to the code window allowing it to float over the existing text.

image

In addition to allowing you to drill into the variable’s values as you would in the watch, locals, or autos windows, you can also add comments which remain with the pinned DataTip.

image

When you stop debugging, the DataTip will disappear. However when you debug into this method again, it will re-appear as long as it is pinned. As a bonus, it will persist even after closing and re-opening Visual Studio.

While Visual Studio 2010 definitely has some rough edges, I continue to be amazed by some of the new UX features that the next version will bring.

Posted on - Comment
Categories: Visual Studio - VB Dev Center -

Setting DataContext Connection String in Data Tier

LINQ to SQL offers a quick mechanism to build a data tier in an n-Tier application. There’s a challenge, however, when using the DBML designer in a Class Library: the designer stores the connection string in the Settings file. Although it appears that you can change it in the config file, any changes you make there will be ignored because the values are actually retained in the generated Settings file.

While you could go into the DataContext’s .Designer file and change the location of the connection string, any changes you make there will be overwritten if you make changes to the DBML file. So what are you to do?

Remember, one of the nice features added to VB 9 and C# 3 was partial methods. With the DataContext, the code generators add an OnCreated partial method that is called from the context’s constructors. As a result, we can implement the partial method in a separate partial DataContext class that is not changed when the DBML is regenerated. Here’s a sample to do that for the context on this site (LinqBlogDataContext):


Imports System.Configuration

Partial Public Class LinqBlogDataContext
    Private Sub OnCreated()
        MyBase.Connection.ConnectionString = _
            ConfigurationManager.ConnectionStrings("LinqBlogConnectionString") _
                .ConnectionString
    End Sub
End Class

When you do this, you can change the connection string in the app.config or web.config and it will be picked up in the business tier correctly. Realize that the design surface will still use the value in the Settings of the Class Library project instead of the config file.

Posted on - Comment
Categories: LINQ - VB Dev Center - VB -

Testing to see if a record Exists in LINQ to SQL

There are a number of options you can consider when testing to see if a record exists using LINQ to SQL. Which one should you use? It depends… In general, check the generated SQL for the various options in SQL Management Studio to see how the execution plans compare. For example, each of the following can tell you if a record exists.


Dim q1 = Customers.FirstOrDefault(Function(c) c.City="London") 

Dim q2 = Customers.Count(Function(c) c.City="London") 

Dim q3 = Customers.Any(Function(c) c.City="London") 

If we take a look at the generated SQL, we'll see that these produce the following SQL Statements:


-- Region Parameters
DECLARE @p0 NVarChar(6) = 'London'
-- EndRegion
SELECT TOP (1) [t0].[CustomerID], [t0].[CompanyName], [t0].[ContactName], 
   [t0].[ContactTitle], [t0].[Address], [t0].[City], [t0].[Region], 
   [t0].[PostalCode], [t0].[Country], [t0].[Phone], [t0].[Fax]
FROM [Customers] AS [t0]
WHERE [t0].[City] = @p0
GO

-- Region Parameters
DECLARE @p0 NVarChar(6) = 'London'
-- EndRegion
SELECT COUNT(*) AS [value]
FROM [Customers] AS [t0]
WHERE [t0].[City] = @p0
GO

-- Region Parameters
DECLARE @p0 NVarChar(6) = 'London'
-- EndRegion
SELECT 
    (CASE 
        WHEN EXISTS(
            SELECT NULL AS [EMPTY]
            FROM [Customers] AS [t0]
            WHERE [t0].[City] = @p0
            ) THEN 1
        ELSE 0
     END) AS [value]

Analyzing this in SQL Management Studio shows that the second and third options each take roughly 30% of the total time, where the first takes 40%. As a result, it would appear that the last two options are faster than the first. Realizing that the first has to hydrate a full object (and throw it away), we can also recognize that additional data is transferred across the wire, which can slow things down as well.

That doesn't tell the whole story. In general, I would expect the last option to perform best because SQL Server can stop as soon as it finds the first matching record, whereas it has to process the entire table (or index) to do a count. Usually, with large volumes of data, Exists will outperform Count.

If we check the overall performance of the last two options, we can see that the last typically performs slower than the second. I suspect this is due to the time it takes to process the expression tree and generate the SQL for these two methods. Also, the underlying database (Northwind) doesn't have many records, so the processing time may be unrealistic compared to results from a larger database.

Also consider whether you need to work with the results or just want to know whether they exist. If you need to work with them once you've determined that they exist, the .Any/Exists option would require a separate query to fetch the actual objects. In that case, FirstOrDefault is better because you only need a single query to the database.
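To make that last point concrete, here is a rough sketch of the two patterns (the context variable and Customer fields follow the Northwind examples above):

```vb
' When we only need a yes/no answer, Any generates the cheap EXISTS query.
If context.Customers.Any(Function(c) c.City = "London") Then
    Console.WriteLine("At least one London customer exists")
End If

' When we intend to use the record anyway, fetch it once and test for
' Nothing rather than issuing a separate existence query first.
Dim customer = context.Customers.FirstOrDefault(Function(c) c.City = "London")
If customer IsNot Nothing Then
    Console.WriteLine(customer.CompanyName)
End If
```

The first shape issues one EXISTS round trip; the second issues one TOP (1) round trip and keeps the hydrated object for further use.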

In general, there is no silver bullet. You need to test your options and determine which is best for your current situation.

Posted on - Comment
Categories: LINQ - VB Dev Center -

Euler Primes with LINQ Iterators

Thanks to Steve Smith’s Project Euler with LINQ, I’ve recently begun playing with the Project Euler questions, seeing how far I can push my algorithm skills along with LINQ and LINQPad. LINQPad makes it easy to slap together code samples and output results, including performance metrics, so I don’t need to worry about that plumbing code and can focus on writing fast code.

While working on one of the problems, I realized the solution would offer a good opportunity to demonstrate a nuance of .NET iterators. Understanding this nuance may help some of you grok how they work inside LINQ to Objects. In this case, the problem is Euler 10: find the sum of all primes less than 2,000,000. We can begin by putting this into a LINQ query which elegantly reflects the desired outcome:


Dim query = _
  Aggregate num In Enumerable.Range(2, 1999998) _
  Where IsPrime(num) _
  Into Sum(CLng(num))

So far, so good. We just need to identify which numbers are prime and we’re finished. Easier said than done. I started by brute-forcing this, which is often not a good idea with Euler problems, since the guidelines expect your solution to run in under a second. My first stab was to set up a list of found prime numbers. Then, as I iterated over the source data, I iterated over each of the previously found primes to see if the current number was divisible by that prime. If it was, I returned that it was not prime. Otherwise, I added it to the prime list and returned that it was a prime. Here’s the code:


Private FoundPrimes As New List(Of Long)

Function IsPrime(ByVal num As Long) As Boolean
  For Each prime In FoundPrimes
    If num Mod prime = 0 Then
      Return False
    End If
  Next

  Return AddToPrimes(num)
End Function 

Function AddToPrimes(ByVal num As Long) As Boolean
  FoundPrimes.Add(num)
  Return True
End Function

Ok, so this works, but it is quite slow (several minutes for 2 million numbers). We can’t speed this up using PLINQ here because we need to move through the list in order; otherwise, we would be adding numbers to our prime list that aren’t actually primes.

We can easily modify this by limiting the set of primes that we test a given number against. We know from mathematical proofs that the highest number we need to test any given number to see if it is a prime, is the square root of that given number. Thus, we modify the IsPrime method as follows:


Function IsPrime(ByVal num As Long) As Boolean
    Dim top = Math.Sqrt(num)
    For Each prime In FoundPrimes
        If prime > top Then Return AddToPrimes(num)
        If num Mod prime = 0 Then
            Return False
        End If
    Next
    Return AddToPrimes(num)
End Function

Running this through LINQPad now returns a respectable 2.003 seconds. This is pretty good, but still doesn’t meet Euler’s guideline of sub-second performance. Back to the grindstone to find a better way. The performance hog here is obviously the repeated iteration over the prime list for each number. Perhaps if we could flag the multiples of each prime as we find it, we could eliminate this iteration entirely. Thus, instead of using a list of primes, we’ll create an array of Booleans the size of the range we want to test:


Const numCount As Integer = 2000000
Dim allNums(numCount - 1) As Boolean

So, how do we find the primes in this list? First, realize that the Where and Select LINQ operators have overloads whose callbacks receive not only the input value, but also the index of the current value. To use these, we will need to modify our query because we can’t access this index using query expressions (at least not in VB). We’ll have to change our query to something like the following:



Dim query = allNums.Where(Function(num, index) IsPrime(index)) _
                   .Select(Function(num, index) index) _
                   .Sum(Function(num) num)

This would work, but the index passed to our Select method is not the index of the underlying data source; rather, it is the index of the item returned from the Where clause. As a result, we’ll need to capture the source index in our Where function and expose it in a scope wide enough that we can read it back in our Select method. Here’s our revised code to this point, including a delegate pointer to a PrimeSieve method that will do the grunt work:


Const numCount As Integer = 2000000
Dim allNums(numCount - 1) As Boolean
Dim foundPrime As Integer 

Sub Main()
    Dim query = allNums _
            .Where(AddressOf PrimeSieve) _
            .Select(Function(ignore) CLng(foundPrime)) _
            .Sum(Function(num) num)
    Console.WriteLine(query)
End Sub

One thing to point out here: we could have eliminated the Select clause if we were dealing with smaller numbers, but we need to widen our results to a Long because the sum would otherwise overflow an Integer.

Now, on to our revised Prime algorithm. In this one, we pass in the current bit which we ignore and the index. The definition of Where requires that this returns a Boolean, so we’ll return false if this number has already been marked as a prime multiple. Otherwise, we’ll mark all multiples of this number as no longer prime and return true:


Private Function PrimeSieve(ByVal num1 As Boolean, ByVal index As Integer) As Boolean
    If index < 1 Then Return False
    If allNums(index) Then Return False 

    foundPrime = index + 1
    For num = index To (numCount - 1) Step foundPrime
        allNums(num) = True
    Next 

    Return True
End Function

At this point, you may be questioning if the underlying query will work if we keep referring back to a single FoundPrime variable. Here is where understanding iterators becomes important. Let’s begin by considering the basic definition of Where and Select (We’ll have to use C# here because VB doesn’t have a syntax for iterators yet):


static IEnumerable<T> Where<T>(IEnumerable<T> source, Func<T, bool> predicate) {
    foreach (T element in source) {
        if (predicate(element)) yield return element;
    }
}

static IEnumerable<S> Select<T, S>(IEnumerable<T> source, Func<T, S> selector) {
    foreach (T element in source) {
        yield return selector(element);
    }
}

What these basically say is: as some outside method asks us to move through the list, return the values that meet the criteria. We are NOT doing a full iteration over the list in the Where method and handing the results on to the next method (Select). If we did, every Select call would see the same (final) foundPrime value. Instead, we move through the results just far enough to find the first value that meets our criteria and then pass control on to the Select method. Select operates over that value and passes control on to the next level, Sum. Sum aggregates the result and pulls the next value, returning control to the Where clause to find the next match.

Let’s step through the process with the first several numbers. The operations in our query don’t start until we actually begin walking through the results. Thus, Sum causes us to fetch the first number in the list. Where passes index 0 to the PrimeSieve predicate, which returns False because the index is under 1 (the number 1 is not considered prime). Where continues to the next bit (index 1).

Since 2 is a prime, we then mark all multiples of 2 (4, 6, 8, 10, etc) true and return true. Because the predicate evaluated true, we yield that on to the Select method which pulls the foundPrime value and passes that on to Sum.

Sum then asks for the next number (3). We re-enter the Where clause after the yield (internally, the compiler generates a state machine for this) and continue the iteration. We now run PrimeSieve for index 2. This bit is still False, so we mark all multiples (6, 9, 12, …) as True. Of course, half of these values are already True; I suspect that checking the bits before setting them would only slow the operation down, so I just set them. We then pass control on to Select, which pulls the new foundPrime and passes that value on to Sum to aggregate.

In the next iteration, we find that allNums(3) (the fourth number) is already marked True, so we return False to Where and continue on to index 4, which is not yet marked because it represents the prime value 5. Rinse and repeat, and we can efficiently evaluate as many numbers as we need.
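If you want to watch this interleaving yourself, here is a small tracing example, separate from the sieve (the module and helper names are mine), that makes the pull model visible:

```vb
Imports System.Linq

Module IteratorDemo
    Sub Main()
        ' Defining the query executes nothing yet.
        Dim query = Enumerable.Range(1, 4) _
                        .Where(AddressOf TraceWhere) _
                        .Select(AddressOf TraceSelect)

        ' Enumerating pulls one value at a time: each number flows through
        ' Where (and Select, if it passes) before the next one is fetched.
        For Each n In query
            Console.WriteLine("Consumed " & n)
        Next
    End Sub

    Function TraceWhere(ByVal n As Integer) As Boolean
        Console.WriteLine("Where: " & n)
        Return n Mod 2 = 0   ' keep the even numbers
    End Function

    Function TraceSelect(ByVal n As Integer) As Integer
        Console.WriteLine("Select: " & n)
        Return n
    End Function
End Module
```

Rather than printing all four "Where" lines first, the output interleaves: Where: 1, Where: 2, Select: 2, Consumed 2, Where: 3, Where: 4, Select: 4, Consumed 4. That is exactly the one-at-a-time pull behavior the sieve relies on.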

So after all of this, what’s the performance difference?

Test Range       Speed with brute force (sec)   Speed with Sieve (sec)   % Improvement

1,000            .004                           .002                     200%
10,000           .007                           .005                     140%
1,000,000        .055                           .009                     611%
10,000,000       .829                           .080                     1036%
100,000,000      15.439                         1.189                    1298%
1,000,000,000    5699.4                         13.078                   43580%

A couple things to mention here:

  • When dealing with small iteration counts, the improvement is relatively small. This is a reminder that you shouldn’t over-optimize code when you don’t need to. The benefit becomes dramatic as the iteration count increases.
  • The order in which the results are pulled is important. Thus, you can’t parallelize this algorithm (using PLINQ). In this example, the prime check has side effects of changing the underlying data source and setting an external property.
  • This version relies on the way iterators work. You would not be able to substitute an observer pattern (like in LINQ to Events and the Reactive Framework). That is a push model rather than a pull model. As a result, it could be possible that you are processing the number 4 before it has been marked as not prime and your results would thus be invalid.
Posted on - Comment
Categories: LINQ - VB Dev Center -

Generating Interfaces for LINQ to SQL Entities

At DevLink I had the pleasure of presenting a session on LINQ to SQL Tricks and Tips. The slides and demos for LINQ to SQL Tricks and Tips are available on my download page if you are interested. Following the session, an attendee voiced the desire for LINQ to SQL to create interfaces while it creates the class definitions in order to make it easier to mock the entities in unit testing scenarios.

As part of the presentation, I showed Damien Guard’s L2ST4 code generation templates. The great thing about these templates is that they are fully customizable. If you’ve been following this blog, you may remember my post on adding property Get logging to LINQ to SQL with T4. In this post, I’m going to show you how to generate and implement interfaces for each table’s columns. I’m only going to cover the table’s columns, not the associations. Additionally, you will need to modify this if you use inheritance in your LINQ to SQL models. I’m using the VB template in this example, but the C# changes are very similar. Hopefully, you will see how easy it is to make these kinds of changes and can add such extensions yourself if necessary.

Ok, so let’s get started. First off, we will set up a flag in the options so that we can toggle creating the interfaces in our code generation. At the top of the file, change the declaration of the options anonymous type adding the CreateInterfaces = true as follows:

var options = new {
	DbmlFileName = Host.TemplateFile.Replace(".tt",".dbml"), // Which DBML file to operate on (same filename as template)
	SerializeDataContractSP1 = false, // Emit SP1 DataContract serializer attributes
	FilePerEntity = false, // Put each class into a separate file
	StoredProcedureConcurrency = false, // Table updates via an SP require @@rowcount to be returned to enable concurrency	
	EntityFilePath = Path.GetDirectoryName(Host.TemplateFile), // Where to put the files	
	CreateInterfaces = true // Add interfaces for each table type
};

Next, we define the interfaces. To make things easier, we’ll just declare the interface for each table just before we define the table itself. This will keep the interfaces in the same namespace and the same file as the tables (if you use the FilePerEntity option). The biggest trick is to figure out where to insert this code.  Search for the following text in the existing template: “if (data.Serialization && class1.IsSerializable) {“. Change the template between the Namespace declaration and the serialization settings as follows:

Namespace <#=data.EntityNamespace#>	

<#		}
#>		

<#
if (options.CreateInterfaces) { #>
	'Interface
	<#=code.Format(class1.TypeAttributes)#>Interface I<#=class1.Name#>
	<#			foreach(Column column in class1.Columns) {#>
	<# if (column.IsReadOnly) {#>ReadOnly <#}#>Property <#=column.Member#> As <#=code.Format(column.Type)#>
	<# } 
	#>
	End Interface
<# } 
#>
<#		if (data.Serialization && class1.IsSerializable) {

Here we create an interface named IClass where Class is the actual name of the class we are going to generate.  Once we have the interface created, we iterate over each of the columns defining the properties that correspond to the table’s columns.

Next, we need to alter the class definition to have it implement our new interface. Scrolling down about 15 lines, find the line where we declare that the class implements the INotifyPropertyChanging and INotifyPropertyChanged interfaces. Change this line to read as follows:

	Implements INotifyPropertyChanging, INotifyPropertyChanged<#
if (options.CreateInterfaces) {#>, I<#=class1.Name#><#}#>

If we were using C#, our job would be done. However, VB requires that interfaces be implemented explicitly.  Since we are generating this code, making this last change is relatively easy as well. Scroll down to the definition of the property getter and setter (in my copy, this is line 334). Change the property definition to read as follows:

#>		<#=code.Format(column.MemberAttributes)#><# if (column.IsReadOnly) {#>ReadOnly <#}#>Property <#=column.Member#> As <#=code.Format(column.Type)#><# 
			if (options.CreateInterfaces) {#> Implements I<#=class1.Name#>.<#=column.Member#><#}#>

			Get

Ok. We’re done. All we need to do is save our changes to the TT file and it will regenerate the classes from our DBML, assuming you have already set your project up to use the TT files rather than the default LINQ to SQL generator.
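Since the motivation was mockability, here’s a sketch of the payoff: a hand-rolled test double implementing the generated interface. ICustomer and its Name member are hypothetical stand-ins for whatever your DBML actually generates:

```vb
' A fake entity for unit tests. It implements the generated ICustomer
' interface, so code written against the interface never touches the
' real LINQ to SQL entity or the database.
Public Class FakeCustomer
    Implements ICustomer

    Private _name As String

    ' VB requires an explicit Implements clause, just like the generated code.
    Public Property Name() As String Implements ICustomer.Name
        Get
            Return _name
        End Get
        Set(ByVal value As String)
            _name = value
        End Set
    End Property
End Class
```

Any repository or service method that accepts ICustomer can now be exercised with FakeCustomer instances in a unit test.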

Posted on - Comment
Categories: VB - LINQ - VB Dev Center -

MVC Sitemap Navigation with XML Literals

As I continue re-writing this site to use MVC, I find more substructures that I can refactor easily. In the original implementation for the menu, I used the asp:Menu control. As I was working to theme the site, I had problems getting it to work acceptably with CSS styles. I also didn't like the reliance on tables.

In an effort to improve on it, I found the method of using unordered lists with list items for the menus (<ul> <li>). I then moved to a modified version of the UL SiteMap menu discussed by Bryant Likes.

In moving to MVC, I was looking for a better option and found one on the MVC tutorial site. This builds the menu with a StringBuilder. I had a couple problems with this implementation however:

  • By using a StringBuilder, you don't get compiler assistance in validating the markup.
  • The implementation doesn't handle security trimming.
  • It only handles a single level of menu items.
  • Coming from my LINQ experience, For Each loops feel like an anti-pattern to me.

To fix these issues, I figured we could rewrite this relatively easily using XML Literals in VB, with a recursive call to handle the potentially n levels of nested siteMapNode elements allowed by the SiteMap XML specification. Even without adding any of the additional functionality, we can drastically simplify the MVC Tutorial implementation by rewriting it with XML Literals in VB:


Dim xMenu1 = <div class="menu">
                <ul id="menu">
                   <%= From node In SiteMap.RootNode.ChildNodes _
                                    .OfType(Of SiteMapNode)() _
                       Select <li class=<%= if(SiteMap.CurrentNode Is node, _
                                               "active", _
                                               "inactive") %>>
                 <a href=<%= node.Url %>>
                                    <%= helper.Encode(node.Title) %></a>
                     </li> %>
                 </ul>
              </div>
Return xMenu1.ToString()

There are a couple of things to point out here. First, the SiteMap.RootNode.ChildNodes property returns a list of Object rather than of SiteMapNode items. We can fix that by filtering the results and strongly typing them at the same time using the .OfType(Of T) extension method.

Second, we eliminate the For Each loop by projecting the new li items in the Select clause. In it, we use the ternary If to determine whether the node in the iteration is the one currently selected. If it is, we apply the "active" style. The rest is relatively straightforward.

At this point, we have fixed issues 1 and 4 from my objection list above. Next, let's deal with the n levels of menu items. To do this, we will replace the LINQ query with a recursive function call. The Menu helper extension now looks like the following:


Imports System.Runtime.CompilerServices

Namespace Helpers
    Public Module MenuHelper
        <Extension()> _
        Public Function Menu(ByVal helper As HtmlHelper) As String
            Dim xMenu = <div class="menu">
                            <ul id="menu">
                                <%= AddNodes(SiteMap.RootNode, helper) %>
                            </ul>
                        </div>
    
            Return xMenu.ToString()
        End Function
    End Module
End Namespace

Now we have an extremely clean and concise XML building of the basic structure. The hard work comes in the AddNodes method.


Private Function AddNodes(ByVal currentNode As SiteMapNode, ByVal helper As HtmlHelper) _
                 As IEnumerable(Of XElement)

   Return From node In currentNode.ChildNodes.OfType(Of SiteMapNode)() _
          Select <li class=<%= If(SiteMap.CurrentNode Is node, "active", "inactive") %>>
                     <a href=<%= node.Url %>><%= helper.Encode(node.Title) %></a>
                     <%= If(node.ChildNodes.Count > 0, _
                          <ul class="child">
                              <%= AddNodes(node, helper) %>
                          </ul>, Nothing) %>
                  </li>
End Function

Essentially, this function moves the LINQ projection from the original rewrite into a separate method. The method takes a SiteMapNode and returns a rendered list of XElements. In addition to generating the HTML for the current node's children, we also check whether each child in turn has any child nodes; if so, we create a new unordered list and recursively call back into AddNodes to render those grandchildren as well. By using a recursive function, we can handle any depth of child nodes that the SiteMap throws at us.

In order to add security trimming, we simply need to add a Where clause. The node's Roles property is a non-generic IList that happens to contain strings. To use LINQ on it, we use the .Cast method to turn it into a generic IEnumerable(Of String). With that in place, we can use the .Any extension method to determine whether the HttpContext's current user is in at least one of the roles specified in the SiteMap source. We also check whether no roles are specified for the node, which means it is not trimmed and all users can access it. Here is the revised body of the AddNodes method:


Return From node In currentNode.ChildNodes.OfType(Of SiteMapNode)() _
       Where node.Roles.Count = 0 OrElse _
             node.Roles.Cast(Of String).Any(Function(role) _
                                            HttpContext.Current.User.IsInRole(role)) _
       Select <li class=<%= If(SiteMap.CurrentNode Is node, "active", "inactive") %>>
                  <a href=<%= node.Url %>><%= helper.Encode(node.Title) %></a>
                  <%= If(node.ChildNodes.Count > 0, _
                      <ul class="child">
                          <%= AddNodes(node, helper) %>
                      </ul>, Nothing) %>
              </li>

That's it. The only thing left is to manage the css styles to handle the fly-outs. You should be able to find plenty of sites that demonstrate how to set that up. If you can thinq of any enhancements to this, let me know.

Posted on - Comment
Categories: MVC - VB Dev Center - VB -

Binding Anonymous Types in MVC Views

While translating this site over to MVC, I ran into a challenge when converting the RSS feed implementation. Currently I'm using XML Literals to generate the RSS, and I could certainly continue down that track from the Controller, similar to the Sitemap implementation on Mikesdotnetting. However, putting the XML generation in the controller directly conflicts with the separation of concerns that MVC embraces. If I were only displaying one RSS feed, I might be willing to break the rule here. However, I'm rendering a number of different RSS feeds: Posts, Posts by Category, and Files.

Since it would be good to have a reusable view, I decided to create a single view which various controllers can use. I was dynamically generating the XML in the past so my queries would now need to project into a type that the view can consume. Here we have several alternatives:

  1. Create a strongly typed object structure which is strictly used to shape our results for the shared Rss view.
  2. Project into a list of System.ServiceModel.SyndicationItem and then bind to that.
  3. Project into an anonymous type and figure out a way to bind to that projection in our view.

I initially thought I would go down the second route similar to the implementation discussed on the DeveloperZen post. However, I wanted to support some of the RSS extensions including comments and enclosures that aren't directly supported in that implementation.

At first I was unsure how to bind an anonymous projection in a View, so I eliminated option 3 and implemented option 1 similar to the strongly typed implementation discussed on Mikesdotnetting blog. To do this, I needed to build the following set of strongly typed structures:


Public Structure RssElement
    Public Title As String
    Public Link As String
    Public PubDate As DateTime
    Public PermaLink As String
    Public TrackBackUrl As String
    Public CommentRss As String
    Public CommentUrl As String
    Public CommentCount As Integer
    Public Description As String
    Public Categories() As Category
    Public Enclosures() As Enclosure
End Structure

Public Structure Category
    Public Url As String
    Public Title As String
End Structure

Public Structure Enclosure
    Public Url As String
    Public Length As Integer
    Public Type As String
End Structure

If I were using VB 10, this would have been done with auto-implemented properties. However I went with structures at this point because I didn't want to type that much for something that was going to be view only anyway.

With this structure in place, I could implement the controller and view. The controller simply projects into this new object structure in the Select clause of a LINQ query. The view can then consume it, since we strongly type the view as a ViewPage(Of IEnumerable(Of RssElement)). Here's the view that I created:

<%@ Page Language="VB" ContentType="application/rss+xml"
    Inherits="System.Web.Mvc.ViewPage(Of IEnumerable(Of RssElement))" %>
<rss version='2.0' xmlns:dc='http://purl.org/dc/elements/1.1/'
     xmlns:slash='http://purl.org/rss/1.0/modules/slash/'
     xmlns:wfw='http://wellformedweb.org/CommentAPI/'
     xmlns:trackback='http://madskills.com/public/xml/rss/module/trackback'>
  <channel>
    <title>Thinq Linq</title>
    <link><%=Url.Action("Post") %></link>
    <description>LINQ and related topics.</description>
    <dc:language>en-US</dc:language>
    <generator>LINQ</generator>
    <% For Each item In Model%>
    <item>
      <title><%=item.Title%></title>
      <link><%=item.Link%></link>
      <pubDate><%=item.PubDate%></pubDate>
      <guid isPermaLink="false"><%= item.PermaLink %></guid>
      <dc:creator>jwooley</dc:creator>
      <slash:comments><%=item.CommentCount%></slash:comments>
      <trackback:ping><%=item.TrackBackUrl%></trackback:ping>
      <comments><%=item.CommentUrl%></comments>
      <wfw:commentRss><%=item.CommentRss%></wfw:commentRss>
      <wfw:comment><%=item.CommentUrl%></wfw:comment>
      <description><%=Html.Encode(item.Description)%></description>
      <% If Not item.Categories Is Nothing Then %>
        <% For Each c In item.Categories%>
      <category domain="<%= c.Url %>"><%=c.Title%></category>
        <% Next %>
      <% End If %>
      <% If Not item.Enclosures Is Nothing Then %>
        <% For Each e In item.Enclosures%>
      <enclosure url='<%=e.Url %>' length='<%=e.Length %>' type='<%=e.Type %>' />
        <% Next %>
      <% End If %>
    </item>
    <% Next%>
  </channel>
</rss>

In comparing this code with the XML Literal implementation, they are amazingly similar. With MVC, I may be able to live without XML Literals in the views as we simply replace a LINQ projection with a For Each loop. Notice here I check to see if the Categories and Enclosures objects exist before I enumerate over each of those arrays. This is because the Post feed doesn't include enclosures and the File feed doesn't need Categories. This flexibility allows us to create a reusable view for all of our needs.

But, I'm not quite happy with this implementation. I would prefer not to have to declare the additional structure layer just to pass the view something to consume. In this case, it feels like we are having the Controller consume the data Model and create a ModelView (RssElement) to be consumed by the View. We don't really need a new pattern (M-C-MV-V), do we? Instead, I would like to be a bit more "Dynamic" in my implementation so that I didn't need this class and could simply project into an anonymous type and eliminate the RssElement structures entirely.

After a bit of reflection, I realized that this is a case where VB is uniquely positioned crossing the bridge between strong typing and dynamic languages. Normally, I do not recommend using the Option Strict Off option, but this is one case where it does come in useful. To begin, we'll remove those pesky structures. Next, we'll change the controllers to project into anonymous types. Here's the revised code for the Post Rss Controller:


    Function ShowPosts() As ActionResult
        Dim posts = From p In (From post In Context.PostItems _
                    Order By post.PublicationDate Descending _
                    Take 20).AsEnumerable _
                    Select New With { _
                        .Description = p.Description, _
                        .Link = Url.Action("Title/" & p.TitleUrlRewrite & ".aspx", "Post"), _
                        .PubDate = p.PublicationDate.ToString("r"), _
                        .Title = p.Title, _
                        .TrackBackUrl = Url.Action("Trackback/" & p.Id, "Seo"), _
                        .PermaLink = "42f563c8-34ea-4d01-bfe1-2047c2222a74:" & p.Id, _
                        .Categories = (From c In p.CategoryPosts _
                                    Select New With { _
                                         .Title = c.Category.Title, _
                                         .Url = Url.Action("Category/" & c.CategoryID, _
                                                           "Post")}).ToArray, _
                        .CommentRss = Url.Action("Comment/" & p.Id, "Rss"), _
                        .CommentUrl = Url.Action("Title/" & p.TitleUrlRewrite, "Post"), _
                        .CommentCount = p.Comments.Count, _
                        .Enclosures = Nothing}

        Return View("ShowFeed", posts.ToList)
    End Function

Notice that we have to be very careful with our property naming and can't leave anything off. This is why we initialize the .Enclosures property to Nothing: we can't initialize it to an empty collection of a type we can't name. Since our view checks whether the object is Nothing before binding it, we are fine here.

Now back to the view. How do we tell the view what type of data the Model contains if we can't name the type? Here's where Option Strict Off comes in handy. However, in a View page, we can't simply state Option Strict Off at the top of our code. Instead, we need to set the CompilerOptions attribute to /optionstrict- as follows:

<%@ Page Language="VB" ContentType="application/rss+xml" 
CompilerOptions="/optionstrict-" Inherits="System.Web.Mvc.ViewPage" %>

In this case, we are not only setting the CompilerOptions, but removing the generic type definition in the Inherits clause. The rest of the view remains intact. Now, we can consume our anonymous type (because we aren't typing the view) and let the Option Strict setting dynamically resolve our method and type names. Notice here, if we were using C# 4.0, we wouldn't be able to use the Dynamic option and state that the page inherits ViewPage<Dynamic> because we can't project into a Dynamic type in our LINQ query.

Now that we have modified our view, we can reuse it. First move it to the Shared folder so that the view will be accessible regardless of which controller tries to consume it. Next, we create other controllers making sure that all of the properties are projected correctly in our LINQ query.


    Function Rss() As ActionResult
        Return View("ShowFeed", _
            From f In GetFiles() _
            Select New With { _
                .Description = f.Description, _
                .Link = "http://www.ThinqLinq.com/" & f.URL, _
                .PubDate = f.LastWriteTime.ToString("r"), _
                .PermaLink = "42f563c8-34ea-4d01-bfe1-2047c2222a74:" & f.Name, _
                .TrackBackUrl = "", _
                .CommentRss = "", _
                .CommentUrl = "", _
                .CommentCount = 0, _
                .Categories = Nothing, _
                .Title = f.Name, _
                .Enclosures = New Object() {New With { _
                                            .Length = f.Length, _
                                            .Type = "application/x-zip-compressed", _
                                            .Url = "http://www.ThinqLinq.com/" & f.URL}}})
    End Function

 

Be aware. Here we are playing with the dangerous part of dynamic languages. We no longer get the compiler to ensure that our type includes all of the necessary properties. If we forget a property or mis-type the property name, we will only know about it when a run-time exception is thrown. Of course, since this is MVC, we can use unit tests to check our type. With dynamic programming, think of the compiler as just another unit test. You need to write the rest of them by hand.

While I like the flexibility that the new dynamic option provides, I miss the comfort that comes from strong typing. Also, I haven't compared the performance of these implementations; I suspect the earlier strongly typed option may outperform this one. With the optimizations around Option Strict Off in VB 10, the performance gap may shrink, but that would need testing as well.

I'll also admit to being relatively new to MVC and welcome better alternatives from those who have been using it longer. What do you Thinq?

Posted on - Comment
Categories: VB Dev Center - VB - LINQ -

VB Syntax Highlighting with JQuery and Chili

At CodeStock, I attended Rod Paddock's intro to JQuery session since I hadn't played with JQuery yet. As often happens when I go to conferences, being in the different environment starts to get the mind thinking in different ways. Sometimes the benefit of the conference isn't necessarily something stated directly, but rather a thought when the mind wanders. One such thought occurred during Rod's presentation where I thought that it might be interesting to "query" sets of text over a larger document and apply formatting to selected words (similar to how Visual Studio colors keywords and other code elements).

A quick search on syntax highlighting found that I was not alone in thinking that JQuery might be a good option for syntax highlighting. Chili is a JQuery-based code highlighter that already supports a number of languages and is relatively easy to incorporate into a site. First, you need to add a script reference to JQuery:

<script type="text/javascript" src="http://jquery.com/src/jquery-latest.pack.js" />

Next, we add a link to the chili script code:

<script type="text/javascript" src="jquery/chili/jquery.chili-2.2.js" />

Third, we designate the path that contains the various language specific implementation details in a script block:

<script id="setup" type="text/javascript"> 
    ChiliBook.recipeFolder = "jquery/chili/";
</script>

Now, when we want to add highlighting to our code, we include it inside a <code> tag whose class is set to the name of the language we want to colorize. Unlike the popular SyntaxHighlighter, Chili doesn't require you to specify the location of each language individually; it loads each language definition dynamically by matching the file name to the class name.

To see how I added colorization to the code above, note that the actual code is wrapped in a <code class="js"> … </code> tag. In this case, there is a js.js file in the recipeFolder that Chili uses to highlight this code. In addition, I'm wrapping the code tag inside a pre tag to eliminate otherwise unnecessary markup (like &nbsp; and <br />). This makes copying and pasting the code easier.

<pre><code class="js">


  <script id="setup" type="text/javascript">
     ChiliBook.recipeFolder = "jquery/chili/";
  </script>
</code></pre>

There's a problem with directly integrating Chili into this site, however: Chili does not include a native VB syntax highlighter. Thankfully, adding new code definitions is as simple as adding a new .js file containing a collection of JSON objects defining which terms are to be colorized and how the styles should be applied. For the current VB implementation, I've added the following colorizations:

  • Comments are green
  • String literals are red
  • Processing instructions (like #Region and #If) are silver
  • Keywords and LINQ keywords are #4040c2
  • XML Literal expression hole symbols (<%= and %>) are a bit different as they use a yellow background with dark gray foreground, but we can easily set this through the style tag.

To do this, we set up a JSON structure to contain the various Regular Expression match patterns and the corresponding styles:

{
      _name: "vb"
    , _case: true
    , _main: {
          com    : { 
              _match: /'.*/ 
            , _style: "color: green;"
        }
        , string : { 
              _match: /(?:\'[^\'\\\n]*(?:\\.[^\'\\\n]*)*\')|(?:\"[^\"\\\n]*(?:\\.[^\"\\\n]*)*\")/ 
            , _style: "color: red;"
        }
        , preproc: { 
              _match: /^\s*#.*/ 
            , _style: "color: silver;"
        }
        , keyword: { 
              _match: /\b(?:AddHandler|AddressOf|AndAlso|Alias|And|Ansi|As|Assembly|Auto|
Boolean|ByRef|Byte|ByVal|Call|Case|Catch|
CBool|CByte|CChar|CDate|CDec|CDbl|Char|CInt|Class|CLng|CObj|
Const|CShort|CSng|CStr|CType|Date|Decimal|Declare|Default|Delegate|
Dim|DirectCast|Do|Double|Each|Else|ElseIf|End|Enum|Erase|Error|Event|Exit|
False|Finally|For|Friend|Function|Get|GetType|GoSub|GoTo|Handles|
If|Implements|Imports|In|Inherits|Integer|Interface|Is|Let|Lib|Like|Long|Loop|
Me|Mod|Module|MustInherit|MustOverride|MyBase|MyClass|
Namespace|New|Next|Not|Nothing|NotInheritable|NotOverridable|
Object|On|Option|Optional|Or|OrElse|Overloads|Overridable|Overrides|
ParamArray|Preserve|Private|Property|Protected|Public|
RaiseEvent|ReadOnly|ReDim|REM|RemoveHandler|Resume|Return|
Select|Set|Shadows|Shared|Short|Single|Static|Step|Stop|String|Structure|
Sub|SyncLock|Then|Throw|To|True|Try|TypeOf|Unicode|Until|Variant|
When|While|With|WithEvents|WriteOnly|Xor)\b/ 
            , _style: "color: #4040c2;"
        }
        , linqkeyword: { 
              _match: /\b(?:From|Select|Where|Order By|Descending|Distinct
|Skip|Take|Aggregate|Sum|Count|Group|Join|Into|Equals)\b/ 
            , _style: "color: #4040c2;"
        }
        , xmlexpressionhole: {
              _match: /\<%=|\%>/
            , _style: "background: #fffebf; color: #555555;"
        }
    }
}

If you want, you can download this file at http://www.thinqlinq.com/jquery/chili/vbasic.js. Now, to use the new definition, simply add the file to the path you defined as the ChiliBook.recipeFolder above. Then add code to your page like the following:


<pre><code class="vbasic">
Private Function FormatCategories(ByVal post As PostItem) As String
    If post.CategoryPosts.Count > 0 Then
        'Categories found. Return them
        Dim response = _
            From catPost In post.CategoryPosts _
            Select val = _
                <span>
                    <a href=<%= "default.aspx?CategoryId=" & _
                                catPost.Category.CategoryId %>>
                        <%= catPost.Category.Description.Trim %></a>
                </span>.ToString()
        Return String.Join(", ", response.ToArray)
    Else
        Return ""
    End If
End Function
</code></pre>

There are a couple issues with this implementation. First, the highlighting only works if you view the page on the site; if you are viewing this through an aggregator, you won't see the syntax highlighting. Personally, I find this an acceptable tradeoff compared to the alternative of injecting the styles inline with the code, as is done with the CopySourceAsHtml project or the Windows Live Writer VSPaste plug-in. With those, the code is correctly highlighted when viewed from an aggregator, but it is horrendous when consumed by blind readers using screen reader systems.

The second issue with this implementation is that it doesn't take context into account. As a result, if you have an object with the same name as one of the keywords, it will be highlighted incorrectly. This will become more of an issue in VS 2010 when we (finally) get type colorization in VB. To do type colorization, we need access to the object symbols, which are unavailable outside of the compiler's environment.

The third issue is that this version doesn't correctly colorize the XML Literals. While I'm sure it is possible, I'm not enough of a Regular Expression expert to figure out all of the options required to enable syntax highlighting for the XML Literals. If someone wants to add that, I would love to try out your suggestions.

Posted on - Comment
Categories: VB - JQuery - VB Dev Center -

Fetching XML from SQL Server using LINQ to SQL

With SQL Server, you can use the For Xml clause (read more in BOL). The quickest option is to add For XML Auto at the end of a SQL statement. You can do this with dynamic SQL or inside a stored proc. If you use a stored proc, however, the DBML tool doesn't recognize the result as XML (and thus doesn't return it as an XElement as it does for XML data type columns).

Regardless of whether you are using stored procs or dynamic SQL, the server returns the result as an array of strings broken up into 4000-character chunks. It is your responsibility to piece this back together. You can concatenate the strings and parse the XML; however, there is no true root node in this result set, only a series of XML elements.

Since you are not going to be able to rely on the generated method stub for the procedure, you may want to consider using ExecuteQuery directly and handling the string parsing yourself. If you define this in a partial class for your context, it will appear to calling code as if it came directly from the database pre-formatted. For example, here is some code that returns the customers from Northwind as an XElement:

Public Function CustomerAsXml() As XElement
    Dim returnVal = Me.ExecuteQuery(Of String)("Select * from Customers For XML Auto")
    Dim fullString = String.Concat(returnVal.ToArray())
    Dim xml = XElement.Parse("<root>" & fullString & "</root>")
    Return xml
End Function

You could substitute the name of your stored proc (with parameters) in place of this dynamic SQL and it should work equally well.

Dim returnVal = Me.ExecuteQuery(Of String)("CustomersXml", New Object() {})
 
Posted on - Comment
Categories: LINQ - VB - VB Dev Center - Linq to XML -

LinqDataSource and CUD operations with Inheritance

When I added the Pingbacks and Trackbacks, I changed the implementation of the Comments to use an inheritance model (TPH) where Comments, Trackbacks, and Pingbacks all inherit from the abstract CommentBase class. To refresh your memory, here's the appropriate part of the DBML designer surface:

Comment inheritance model

While this works fine with minimal changes when viewing data, it can cause problems if you are using the LinqDataSource for editing values. When trying to update or delete a record, you may encounter a message similar to the following:

Cannot create an abstract class.

Description: An unhandled exception occurred during the execution of the current web request. Please review the stack trace for more information about the error and where it originated in the code.
Exception Details: System.MissingMethodException: Cannot create an abstract class.

So what's happening here? When the page is posted back, the LinqDataSource tries to re-create the object based on the incoming values so that it can reattach it to the context to perform the update. In the case of our polymorphic CommentBases collection, the data source doesn't know how to recreate the necessary object and thus throws the exception.

Normally when trying to override CUD behaviors with the LinqDataSource, you would handle its Deleting, Inserting, Updating, and Selecting events. However in this case, the datasource has already tried to hydrate the object in order to pass it into the event handler as part of the event args. Thus handling it there is too late.

As an alternative, we can intercept the request earlier in the process. In the case of the ListView control, we can intercept this in the ItemDeleting, ItemUpdating and ItemInserting event handlers. The key here is that when we're done handling the request at this level, we need to keep the LinqDataSource from receiving it by setting the Cancel property of the EventArgs and cleaning up ourselves.

When deleting a record, this is a simple process. First, we grab the key of the record being deleted from the e.Keys collection. We then use that to search for the comments that have that ID, passing the results to DeleteAllOnSubmit. Once the change is saved, we block the default behavior by setting Cancel on the event args. Here's a sample implementation:

Protected Sub ListView1_ItemDeleting(ByVal sender As Object, _
ByVal e As System.Web.UI.WebControls.ListViewDeleteEventArgs) _
Handles ListView1.ItemDeleting

    If e.Keys.Count > 0 Then
        Using dc As New LinqBlog.BO.LinqBlogDataContext
            dc.CommentBases.DeleteAllOnSubmit( _
                From c In dc.CommentBases _
                Where c.CommentId = CInt(e.Keys(0)))
            dc.SubmitChanges()

            'Suppress the default behavior of the binding source
            'The binding source doesn't know how to instantiate the
            'original value type on post-back due to the MustInherit inheritance
            ListView1.EditIndex = -1
            e.Cancel = True
        End Using
    End If
End Sub

Updating follows a similar process. In this case, we fetch the object being edited based on the value in the Keys collection and then replay the changes from the event args' NewValues collection.

Protected Sub ListView1_ItemUpdating(ByVal sender As Object, _
ByVal e As System.Web.UI.WebControls.ListViewUpdateEventArgs) _
Handles ListView1.ItemUpdating

    If e.Keys.Count > 0 Then
        Using dc As New LinqBlog.BO.LinqBlogDataContext
            Dim id As Integer = CInt(e.Keys(0))
            Dim item = dc.CommentBases.Where(Function(c) c.CommentId = id).FirstOrDefault
            If Not item Is Nothing Then
                'Set the values
                For i = 0 To e.NewValues.Count - 1
                    CallByName(item, e.NewValues.Keys(i), CallType.Set, e.NewValues(i))
                Next
                dc.SubmitChanges()

                'Suppress the default behavior of the binding source
                'The binding source doesn't know how to instantiate the
                'original value type on post-back due to the MustInherit inheritance
                ListView1.EditIndex = -1
                e.Cancel = True
            End If
        End Using
    End If
End Sub

Since we're not implementing inserting from the grid for comments, we don't need to code that piece. I'll leave it to you, the reader, to try that one. Realize that on insert you will need to determine the CommentType and create the appropriate concrete instance before setting the values. Otherwise, the process is basically the same.
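A minimal sketch of that inserting case might look like the following. This is an assumption-laden illustration, not the site's actual code: it assumes the insert template posts a "CommentType" value naming the concrete class, and that the concrete classes are named Comment, Trackback, and Pingback (only Pingback is confirmed by the code elsewhere in this series).

```vb
'Sketch only: assumes a posted "CommentType" field selects the concrete type.
Protected Sub ListView1_ItemInserting(ByVal sender As Object, _
ByVal e As System.Web.UI.WebControls.ListViewInsertEventArgs) _
Handles ListView1.ItemInserting

    Using dc As New LinqBlog.BO.LinqBlogDataContext
        'Pick the concrete type BEFORE setting any values
        Dim item As CommentBase
        Select Case CStr(e.Values("CommentType"))
            Case "Trackback" : item = New Trackback
            Case "Pingback" : item = New Pingback
            Case Else : item = New Comment
        End Select

        'Replay the remaining posted values onto the new instance
        For i = 0 To e.Values.Count - 1
            If e.Values.Keys(i) <> "CommentType" Then
                CallByName(item, e.Values.Keys(i), CallType.Set, e.Values(i))
            End If
        Next

        dc.CommentBases.InsertOnSubmit(item)
        dc.SubmitChanges()

        'As with delete and update, keep the LinqDataSource out of it
        e.Cancel = True
    End Using
End Sub
```

The shape mirrors the update handler above: do the work ourselves, then cancel so the LinqDataSource never tries to instantiate the abstract base.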

One other thing to keep in mind is concurrency. Because we are refetching the object as part of the CUD operation, we are effectively throwing away the true original values. We can take care of that if we use a database-issued timestamp (rowversion) column for concurrency.
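For reference, the LINQ to SQL mapping for such a column could look like the sketch below. The column name "Version" and storage field are assumptions for illustration; the DBML designer generates the equivalent when a column's "Time Stamp" property is set to True.

```vb
'Sketch of a rowversion mapping on CommentBase, assuming a column named "Version".
Private _Version As System.Data.Linq.Binary

<Column(Storage:="_Version", AutoSync:=AutoSync.Always, _
        DbType:="rowversion NOT NULL", IsDbGenerated:=True, IsVersion:=True)> _
Public Property Version() As System.Data.Linq.Binary
    Get
        Return _Version
    End Get
    Set(ByVal value As System.Data.Linq.Binary)
        _Version = value
    End Set
End Property
```

With IsVersion set, LINQ to SQL uses this single column for optimistic concurrency checks instead of comparing every original value, which fits the refetch-and-update pattern used here.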

Posted on - Comment
Categories: VB Dev Center - VB - LINQ -

Implementing Pingbacks

Recently, I discussed Sending and Receiving TrackBacks on this blog. The TrackBack API is not the only mechanism which blog engines use to communicate with each other about the links that are included in individual posts. Wikipedia lists three methods to keep track of which articles are cross-linked: Refback, Trackback, and Pingback. Of these, perhaps the trickiest to implement is the Pingback because it uses the xml-rpc style of communication rather than the SOAP or REST that most .Net programmers are familiar with.

Thankfully, Cook Computing has released an open source implementation for xml-rpc which makes programming for it similar to programming .Net services by relying on attributes to specify the methods and contracts that a class consumes. The source and documentation can be found at http://www.xml-rpc.net/. Once we add a reference to the CookComputing.XmlRpcV2.dll, we can begin coding our Pingback implementation.

Unlike WCF, we don't need to perform a lot of configuration steps. We simply add a Generic Handler (.ashx) file and point it to the class that will perform the implementation. In this case, our handler will be called PingbackService.ashx and consists of the following:

<%@ WebHandler Language="VB" Class="LinqBlog.BO.Services.PingbackService" %>

As they say in the Staples advertisement, "That was easy!" Next, we implement the PingbackService class in our business tier. As with the Trackback, we'll separate this process into the sending operation and the receiving operation.

Sending Pingbacks

The steps to send a pingback are similar to those for a Trackback:

  • Find the links in our post.
  • Check the post's target page to see if the hosting server supports the Pingback API
  • Send the pingback to the hosting server's URI

The first step is identical to the TrackBack, so refer to the ParseAndSendTrackbacks method from my previous post for that code. Before we can send the ping, we need to check the server for our link to see if it supports Pingbacks. The Pingback specification allows two options for server discovery: the presence of an X-Pingback HTTP header looking like:

X-Pingback: http://www.ThinqLinq.com/Api/PingbackService.ashx

Or a <link> tag in the web page as follows:

<link rel="pingback" href="http://www.ThinqLinq.com/Api/PingbackService.ashx" />

Thus we need to check for either of these auto-discovery mechanisms to find the URI of the pingback server. Here's a sample implementation which takes a post's url and returns the url of the server's service (or an empty string if the server doesn't support Pingbacks).

Private Function GetPingbackServer(ByVal destination As String) As String
    Dim destUri As Uri = Nothing
    If Not Uri.TryCreate(destination, UriKind.Absolute, destUri) Then
        'Make sure we have a valid uri and that it isn't from this site (relative uri)
        Return ""
    End If

    Dim server As String = ""
    Dim req = DirectCast(WebRequest.Create(destination), HttpWebRequest)
    req.Referer = "http://www.thinqLinq.com"
    Using resp = DirectCast(req.GetResponse, HttpWebResponse)
        'Check headers for x-Pingback
        server = resp.Headers.Get("x-Pingback")
        If server <> "" Then Return server

        'Check for link element
        If resp.Headers.AllKeys.Contains("Content-Type") AndAlso _
            resp.Headers("Content-Type").StartsWith("text/html") Then

            Dim client As New WebClient()
            client.UseDefaultCredentials = True
            Dim page = client.DownloadString(destination)
            Dim regexString = _
                String.Format("<link rel={0}pingback{0} href={0}[a-z0-9:\.\/_\?\-\%]*{0}", _
                              ControlChars.Quote)
            Dim match = Text.RegularExpressions.Regex.Match(page, regexString, _
                                                            RegexOptions.IgnoreCase).Value
            If Not String.IsNullOrEmpty(match) Then
                Dim startIndex As Integer = match.IndexOf("href=") + 6
                Dim ret = match.Substring(startIndex, match.Length - startIndex - 1)
                Return ret
            End If
        End If
    End Using
    Return ""
End Function

In this case, checking the headers is easy. Finding the <link> tag takes a combination of Regular Expressions and string parsing since the <link> tag can be either HTML or XHTML compliant (and thus we can't use XML parsing on it). Now that we know the address of our post, the address that we're linking to, and the address of the linking site's pingback server, we can issue the request to the server using XML-RPC.NET. Here's the code:

Public Function SendPing(ByVal source As String, ByVal destination As String) As String

    Dim server = GetPingbackServer(destination)
    If Not server = "" Then
        Dim proxy As IPingbackPing = _
            DirectCast(XmlRpcProxyGen.Create(GetType(IPingbackPing)), _
                       IPingbackPing)
        proxy.Url = server
        Return proxy.Ping(source, destination)
    End If
    Return ""
End Function

Typically with XML-RPC.NET, we specify the server's address in a static attribute on the service type. However, in our case the URL isn't known at compile time. As a result, we use the XmlRpcProxyGen.Create method to create a proxy for the RPC service at runtime. The Create method takes the interface type to implement. We define the interface with the required attributes as follows:

<XmlRpcUrl("http://ThinqLinq.com/SetAtRuntime.aspx")> _
 Public Interface IPingbackPing
    Inherits IXmlRpcProxy

    <XmlRpcMethod("pingback.ping")> _
    Function Ping(ByVal source As String, ByVal destination As String) As String

End Interface

Notice that the interface does specify an XmlRpcUrl. This is just a placeholder, which we replace in the SendPing method by setting proxy.Url to the actual server's address. The act of calling the Ping method is trivial thanks to the generated proxy.

Receiving a Pingback

Switching gears to the server side now, receiving a ping is actually easy with the xml-rpc.net implementation. Our class inherits from CookComputing.XmlRpc.XmlRpcService, which takes care of the core handler code and wires our method up to the RPC call. To associate our method with the RPC method name, we add the CookComputing.XmlRpc.XmlRpcMethod attribute, specifying a method name of "pingback.ping" as required by the Pingback specification. This method takes two string parameters, the source url and the destination url, and returns a message indicating the outcome of the request.
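For context, the class shell that hosts the method could look like this minimal sketch. The LinqBlog.BO root namespace is inferred from the .ashx directive shown earlier and is an assumption on my part; the ping method itself sits inside this class.

```vb
'Sketch of the service shell, assuming the project's root namespace is LinqBlog.BO
'so the full name matches the .ashx directive (LinqBlog.BO.Services.PingbackService).
Imports CookComputing.XmlRpc

Namespace Services

    Public Class PingbackService
        Inherits XmlRpcService

        'The pingback.ping method shown below lives here

    End Class

End Namespace
```

Inheriting XmlRpcService is what lets the generic handler route the raw XML-RPC POST to the attributed method with no further configuration.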

      <XmlRpcMethod("pingback.ping")> _
      Public Function Ping(ByVal source As String, ByVal destination As String) As String
          Using dc As New LinqBlogDataContext
              'Get the post's ID based on the destination url from 
              'the custom URL Rewriting scheme.
              Dim postId As Integer = GetPostId(destination, dc)
              If postId > 0 Then
                  'Make sure we haven't already added this pingback
                  If (From c In dc.CommentBases _
                      Where c.CreatorLink = source And _
                          c.PostId = postId).Count = 0 Then

                      dc.CommentBases.InsertOnSubmit( _
                          New Pingback With {.CreatorLink = source, _
                                             .Description = "Pingback from " & source, _
                                             .EnteredDate = Now, _
                                             .PostId = postId, _
                                             .Creator = "Pingback", _
                                             .CreatorEmail = ""})
                      dc.SubmitChanges()

                      Return "pingback registered successfully"
                  Else
                      Return "pingback already registered"
                  End If
              Else
                  Return "pingback not registered, no post found"
              End If
          End Using
      End Function

 

In this case, we first need to make sure that the destination location includes a reference to a post (through the id in the querystring or via URL Rewriting). If we have a valid ID, we then check whether there is already a linkback associated with this post and the given source address; we don't want to register duplicate linkbacks. This is particularly important since we are implementing both Pingbacks and Trackbacks and the calling site could send both requests; we only want to register one. Assuming we have a new Pingback for this post, we create a new Pingback object (which inherits from CommentBase) and set the necessary values. With LINQ to SQL, applying the change is the standard SubmitChanges call on the context. We finish by letting the client know what happened in our call. This is mostly a courtesy, as the specification doesn't require a specific response outside of exception codes.

Feel free to pingback (or trackback) to this post to let me know if this series of posts has been helpful to you.

Posted on - Comment
Categories: VB Dev Center - VB - LINQ - SEO -

ADO.NET Entity Framework Documentation Samples in VB

Last week, I announced that my translations of the Entity Framework samples were available in VB. Today the ADO.Net team announced that the next set has been posted. These are part of the ADO.Net Entity Framework Documentation Samples, the projects used in the EF quick start and walkthroughs that come with the .Net documentation. They are a set of mini applications demonstrating the use of EF within the context of an application.

The Course Manager sample was previously translated, but the HR Skills, AdventureWorks Data Binding, and Research and Collaboration Tool samples were just updated today. Unlike the other samples, these don't have separate downloads for each sample, but rather include both C# and VB versions with each download. Here's the description of each of these projects as taken from the MSDN site:

  • CourseManager.zip
    The CourseManager Windows forms application created by completing the Entity Framework quickstart.
  • HRSkillsCombined.zip
    This is a Visual Studio 2008 solution that contains both a Windows forms project and an ASP.NET project. Both samples demonstrate data binding to entity objects. The ASP.NET sample uses the EntityDataSource control for data binding.
  • AdWksSalesWinDataBind.zip
    The AdventureWorks Data Binding sample demonstrates data binding that uses the Entity Framework. This application displays and modifies SalesOrderDetail entities associated with SalesOrderHeader entities.
  • ResearchCollaborationAssistant.zip
    The Annotation and Research Collaboration Tool aids research and collaboration by creating reference annotations and contact entities that can be searched for both relevant Web pages and people associated with topics or search texts.

I hope you find these samples helpful. I'm not sure that I would recommend using the manual databinding that the team used when creating these samples as there are quite a few cases where they could have relied on native databinding rather than manually adding items to text boxes. These translations were fairly literal translations on purpose.

If you just want to learn the query syntax and see the capabilities of EF, the Entity Framework Query Samples are a better source of information.

Thanks to both the VB Team and Data Team for recognizing the need for these samples in VB.

Posted on - Comment
Categories: Entity Framework - VB Dev Center - VB - ADO.Net Data Services -

Sending TrackBacks

Yesterday, I showed how we can receive trackbacks from other sites using the TrackBack API. Today, we'll look at the other side of this picture: sending TrackBacks to other sites based on links in the post. Sending a TrackBack entails several steps:

  1. Parsing the post to find links to other sites.
  2. Checking the target site to see if it supports TrackBacks.
  3. Formatting and sending the TrackBack to the target's service.
  4. Checking for error responses.

Thus with every post I create now, I include a method to parse the post and send the trackbacks as necessary. Parsing the post is a relatively painless process: I scan the raw HTML of the post for anything that looks like href="….". Although I typically use LINQ for querying, in this case the best tool for the job is a regular expression. I admit that I'm not the best at regular expressions, so if anyone has a better alternative, let me know. Here's the method that starts the parsing process:


Public Shared Sub ParseAndSendTrackbacks(ByVal post As PostItem)
    'find references that support trackbacks
    Dim pattern As String = String.Format("href={0}[a-z0-9:\.\/_\?\-\%]*{0}", _
                                          ControlChars.Quote)
    Dim matches = Regex.Matches(post.Description, pattern, RegexOptions.IgnoreCase)
    For Each Link In matches.OfType(Of RegularExpressions.Match)()
        Try
            Dim startIndex = Link.Value.IndexOf(ControlChars.Quote) + 1
            Dim urlPart As String = Link.Value.Substring(startIndex, _
                                        Link.Value.Length - startIndex - 1)
            Dim svc As New TrackbackService
            svc.SendTrackback("http://www.ThinqLinq.com/Default/" & _
                              post.TitleUrlRewrite & ".aspx", post, urlPart)
        Catch ex As Exception
            Trace.Write(ex.ToString)
        End Try
    Next
End Sub

As you can see, this consists mostly of the Regex match and some string parsing to get down to the url that we are referring to. We then send that url into a method which will send the trackback.


Public Sub SendTrackback(ByVal postUrl As String, ByVal post As PostItem, ByVal externalUrl As String)

    Dim server = GetTrackbackServer(externalUrl)
    If server <> "" Then
        'Send the trackback to the sever
    End If
End Sub

Here, the first step is to see if the page that we are referencing in our post supports the trackback API. The TrackBack API has an auto-discovery mechanism whereby the page includes an embedded RDF block which specifies the URI that a client should ping when it wants to issue a trackback. On this site, the trackback server URI for my Receiving Trackbacks post is:

<!--
    <rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
      xmlns:dc="http://purl.org/dc/elements/1.1/"
      xmlns:trackback="http://madskills.com/public/xml/rss/module/trackback/">
      <rdf:Description rdf:about="http://www.ThinqLinq.com/About.aspx"
       dc:identifier="http://www.foo.com/Default.aspx" dc:Title="Thinq Linq"
       trackback:ping="http://ThinqLinq.com/Trackback.aspx?id=22050" />
    </rdf:RDF>
-->

In this case, notice that the XML is included inside a comment block; the TrackBack Technical Specification indicates that this is required by some validators. Since we know that the site needs to include a node like this, we can request the page of the post we are linking to and see if it includes the appropriate rdf information. We can't assume that the page will be XHTML compliant, so we use another regular expression to find the <rdf:RDF></rdf:RDF> node and then use XML literals in VB 9 to parse it and get the value of trackback:ping. Since the XML is strongly typed with namespaces, we start by adding the appropriate Imports to our class file. If we don't include these imports, our LINQ query will fail because the namespaces are required when querying the XML.

Imports <xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#">
Imports <xmlns:dc="http://purl.org/dc/elements/1.1/">
Imports <xmlns:trackback="http://madskills.com/public/xml/rss/module/trackback/">

With the namespaces in place, we can proceed with the rest of our code to find the TrackBack server's URI. This method will return the server's address or an empty string if the server is not found for any reason.

Private Function GetTrackbackServer(ByVal externalUrl As String) As String
    Try
        'Make sure that we have a valid external link
        Dim linkUri As Uri = Nothing
        If Not Uri.TryCreate(externalUrl, UriKind.Absolute, linkUri) Then
            Return ""
        End If
        Dim server As String = ""

        'Set up the HTTP request
        Dim req = DirectCast(WebRequest.Create(linkUri), HttpWebRequest)
        req.Referer = "http://www.thinqLinq.com"
        Dim client As New WebClient()
        client.UseDefaultCredentials = True

        'Get the page's contents
        Dim page = client.DownloadString(externalUrl)

        'Find the rdf tag
        Dim regexString = "<rdf:rdf(.*?)rdf:rdf>"
        Dim match = Text.RegularExpressions.Regex.Match(page, regexString, _
                        RegexOptions.IgnoreCase Or RegexOptions.Singleline).Value
        If Not String.IsNullOrEmpty(match) Then
            'Use LINQ to XML to get the trackback:ping attribute's value
            Dim rdf = XDocument.Parse(match)
            Dim url = rdf.Root.<rdf:Description>.FirstOrDefault.@trackback:ping
            If Not String.IsNullOrEmpty(url) Then
                Return url
            End If
        End If
    Catch ex As Exception
        Diagnostics.Trace.WriteLine(ex.ToString())
    End Try

    'Something didn't work right, or the site doesn't support TrackBacks.
    Return ""
End Function

Now that we know the server, we can finish off the process of sending our request to the server that we started earlier. In this case, we create the request setting the method to "POST" and the ContentType to "application/x-www-form-urlencoded". We then build our form's values that we make sure to UrlEncode and place in the request's content stream.

Public Sub SendTrackback(ByVal postUrl As String, ByVal post As PostItem, ByVal externalUrl As String)

    Dim server = GetTrackbackServer(externalUrl)
    If server <> "" Then
        Dim req = DirectCast(WebRequest.Create(server), HttpWebRequest)
        req.Method = "POST"
        req.ContentType = "application/x-www-form-urlencoded"
 
        Dim content As New StringBuilder()
        content.AppendFormat("url={0}", HttpUtility.UrlEncode(postUrl))
        content.AppendFormat("&title={0}", HttpUtility.UrlEncode(post.Title))
        content.Append("&blog_name=ThinqLinq")
        content.AppendFormat("&excerpt={0}", _
                             HttpUtility.UrlEncode(StripHtml(post.Description, 200)))

        Dim contentBytes = System.Text.Encoding.ASCII.GetBytes(content.ToString())
        req.ContentLength = contentBytes.Length

        Using reqStream = req.GetRequestStream
            reqStream.Write(contentBytes, 0, contentBytes.Length)
        End Using

        Dim resp = req.GetResponse()
        Using respStream = resp.GetResponseStream()
            Dim reader As New StreamReader(respStream)
            Dim response = XDocument.Parse(reader.ReadToEnd())
            'Check for errors
            If CDbl(response...<error>.FirstOrDefault()) > 0D Then
                Throw New InvalidOperationException(response...<message>.FirstOrDefault().Value)
            End If
        End Using
    End If
End Sub

Once we send the request, we make sure to check the response to see if there were any errors. As we mentioned yesterday, the TrackBack Technical Specification requires that the response be XML in the following format:

<?xml version="1.0" encoding="utf-8"?>
<response>
   <error>1</error>
   <message>The error message</message>
</response>

Since it is XML, it is easy to parse this using LINQ to XML. If an error is found, we raise an exception including the contained error message from the server.

Posted on - Comment
Categories: VB - VB Dev Center - LINQ - SEO -

Receiving Trackbacks

If you've been following along, I've been working on enhancing this site a bit recently. A couple of the most recent enhancements can be found in the following posts:

Continuing in this tradition, I wanted to include the ability to be notified when other sites post links to my posts. There are several such APIs that support this kind of notification, including Trackbacks and Pingbacks. In this post, we'll look at receiving Trackbacks and saving them to our database (using LINQ of course).

The technical specification for Trackbacks is hosted at http://www.movabletype.org/documentation/developer/callbacks/ . To implement the listener part of the trackback, we simply need to be able to accept an HTTP POST which includes the post id in the URI and the trackback's details as form parameters. For example, if I wanted to send a trackback for my Paging with AJAX post, I would issue the following request:

POST http://ThinqLinq.com/Trackback.aspx?id=22040
Content-Type: application/x-www-form-urlencoded; charset=utf-8

title=Test+Postback&url=http://www.fansite.com/&excerpt=Good+post&blog_name=Fansite

In this request, the URI specifies the name of the trackback server including the ID of the post we are tracking: http://ThinqLinq.com/Trackback.aspx?id=22040. The trackback API specifies that the Content-Type should always be application/x-www-form-urlencoded; charset=utf-8. The body specifies the values we want the trackback server to know about. Each of these is optional. In this case, we want to send the server a trackback for a post called "Test Postback" which can be found at http://www.fansite.com on the blog named "Fansite". To be nice, we'll include an excerpt of our post with the value of "Good post". Because the content type needs to be urlencoded, we need to make sure that each value is encoded properly. To summarize, the following parameter values would be sent for our test request:

  • title=Test Postback
  • url=http://www.fansite.com/
  • excerpt=Good post
  • blog_name=Fansite
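
As a small standalone sketch (not code from the site itself), here's how those four values could be assembled into the encoded body with HttpUtility.UrlEncode:

```vb
Imports System.Web

Module TrackbackBodySketch
    Sub Main()
        'Each value must be url-encoded before it goes into the form body
        Dim body = String.Format("title={0}&url={1}&excerpt={2}&blog_name={3}", _
                                 HttpUtility.UrlEncode("Test Postback"), _
                                 HttpUtility.UrlEncode("http://www.fansite.com/"), _
                                 HttpUtility.UrlEncode("Good post"), _
                                 HttpUtility.UrlEncode("Fansite"))
        'Produces a body like: title=Test+Postback&url=http%3a%2f%2f...
        Console.WriteLine(body)
    End Sub
End Module
```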

One tool to issue a raw HTTP request to test this is Fiddler. Inside Fiddler, we can select the Request Builder tab. Select "POST" as the method and enter the URI that we wish to post to (http://ThinqLinq.com/Trackback.aspx?id=22040). We then set the content type in the Request Headers and our post content in the Request Body. Below is a screen shot to send this request using Fiddler.

[Screenshot: sending the trackback request from Fiddler's Request Builder]

We'll discuss what we need to do to send this request from code in a later post. For now, we'll focus on receiving this request. While I wanted to do this with WCF, implementing it as a standard Web page is easiest. As a result, we'll create a page called Trackback.aspx. Since there is no real UI on this, we'll remove everything from the page except for the pointer to the page's code file:

<%@ Page Language="VB" AutoEventWireup="false" CodeFile="Trackback.aspx.vb" Inherits="Trackback" %>

On the page load, we need to clear our buffer because we're going to just send a simple XML response indicating any errors we may experience.

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        Response.Buffer = True
        Response.Clear()
        Response.Cache.SetCacheability(HttpCacheability.NoCache)
        Response.ContentType = "text/xml"

Next, we need to create the object that we want to persist in our database. We already have a Comment table which maps to our Comments. In many ways Trackbacks are like Comments, so we'll use the same table. Since they do have different behaviors, we'll use LINQ to SQL's ability to support inheritance. Thus we'll add a discriminator column on our Comment table called CommentType and default the value to "C" since most of the items in that table will be comments.

Next, in our DBML we need to indicate the new mapping. Our existing comment class will now be an abstract base class called CommentBase. We'll change its Inheritance Modifier to MustInherit as a result. We'll then add three derived types for Comment, Trackback, and Pingback to model the behaviors we'll be implementing now. We'll also need to set the Inheritance Discriminator values on each of the inheritance arrows. When we're done, this piece of our model looks as follows:

[Diagram: the CommentBase inheritance hierarchy with Comment, Trackback, and Pingback in the DBML designer]
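
Under the covers, the designer expresses this model with LINQ to SQL's InheritanceMapping attributes. Here's a rough sketch of the generated mapping (class and column names follow the model above; the actual designer-generated code will differ in detail):

```vb
Imports System.Data.Linq.Mapping

'Sketch only: approximates what the DBML designer generates for this hierarchy.
<Table(Name:="dbo.Comments")> _
<InheritanceMapping(Code:="C", Type:=GetType(Comment), IsDefault:=True)> _
<InheritanceMapping(Code:="T", Type:=GetType(Trackback))> _
<InheritanceMapping(Code:="P", Type:=GetType(Pingback))> _
Partial Public MustInherit Class CommentBase
    'The discriminator column tells LINQ to SQL which derived type to materialize
    <Column(IsDiscriminator:=True)> _
    Private _CommentType As String
End Class

Public Class Comment
    Inherits CommentBase
End Class

Public Class Trackback
    Inherits CommentBase
End Class

Public Class Pingback
    Inherits CommentBase
End Class
```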

Now that we have our new Trackback type, we can instantiate it on our Page_Load with the form values that we get from our page's request fields.

      Dim trackback As New LinqBlog.BO.Trackback
      Dim id As Integer
      If Not Integer.TryParse(Request("id"), id) Then
          Return CreateFailureMessage("Id invalid")
      End If

      trackback.PostId = id
      trackback.Creator = If(Request.Params("blog_name"), "")
      trackback.CreatorEmail = String.Empty
      trackback.Description = _
          String.Format("Trackback from <a href={0}{1}{0}>{2}</a>", _
                        ControlChars.Quote, _
                        If(Request.Params("url"), ""), _
                        HttpUtility.UrlDecode(If(Request.Params("title"), "")))
      trackback.EnteredDate = Now
      trackback.CreatorLink = If(Request.Params("url"), "")

We do need to decode the url-encoded values. We also need to watch out for missing values because each of the form values is optional. In this case, we'll use VB 9's If operator in its two-argument, coalescing form. Also notice here that we aren't setting the CommentType field. Since we are using the discriminator column and set that in LINQ to SQL's metadata, when we save Trackback objects, it will automatically assign the type to "T" for us.
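As a quick standalone illustration (not code from the page itself), VB 9's If operator comes in both flavors:

```vb
Module IfOperatorSketch
    Sub Main()
        Dim blogName As String = Nothing

        'Two-argument (coalescing) form: returns the first operand unless it is Nothing
        Dim creator = If(blogName, "")

        'Three-argument (ternary) form: condition, value if true, value if false
        Dim label = If(creator = "", "anonymous", creator)

        Console.WriteLine(label)  'prints "anonymous"
    End Sub
End Module
```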

Now that we have our trackback object, adding it to the database with LINQ is trivial. However, we do need to make one additional check before saving the trackback. We don't want to save multiple trackbacks for a single post. Thus we'll check to see if there are any trackbacks for this post in the database already before saving this one:

            Using dc As New LinqBlogDataContext
                'Make sure we don't already have a comment for this post from this url
                If (Aggregate tb In dc.CommentBases.OfType(Of Trackback)() _
                    Where tb.PostId = id And _
                          tb.CreatorLink = trackback.CreatorLink _
                    Into Count()) = 0 Then

                    'Add it
                    dc.CommentBases.InsertOnSubmit(trackback)
                    dc.SubmitChanges()

                End If
            End Using

We're almost done. The last step is to send the appropriate response to the sending system as required by the Trackback API specification. This is a simple XML response containing any error messages. With XML Literals, this is a simple copy paste operation from the Trackback API:

 

    Private Function CreateSuccessMessage() As String
        Return <?xml version="1.0" encoding="utf-8"?>
               <response>
                   <error>0</error>
               </response>.ToString()
    End Function

    Private Function CreateFailureMessage(ByVal description As String) As String
        Return <?xml version="1.0" encoding="utf-8"?>
               <response>
                   <error>1</error>
                   <message><%= description %></message>
               </response>.ToString()
    End Function

With this in place, we just add the following inside of our If statement after we save the value to the database:

    Response.Write(CreateSuccessMessage)
    Response.End()

If there were any issues in creating or saving our trackback, we can send the Response.Write(CreateFailureMessage(error)).

Feel free to test this by tracking back to this post. Be aware that I am moderating the trackbacks to avoid the comment spam that is inevitable when exposing publicly known APIs like this. Please don't abuse the ability to send trackbacks as I don't want to have to remove the functionality.

Posted on - Comment
Categories: VB - VB Dev Center - LINQ - SEO -

Entity Framework Samples in Visual Basic

For those Visual Basic users out there who have struggled with the fact that the samples were only available in C#, you can now rejoice. There are a number of projects that have now been translated into Visual Basic for your learning pleasure. You can find these samples on the MSDN Code Gallery’s Entity Framework page. At this point the following projects have been translated.

  • Entity Framework Query Samples Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    The Entity Framework Query Samples is a small Windows Forms program that contains several basic Entity SQL and LINQ to Entities queries against the NorthwindEF Entity Data Model (based on a modified version of Northwind). Its goal is to help you learn the features of the two query languages supported by EF and visualize what the results and the translated store query look like.
  • Entity Framework Lazy Loading Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    This sample shows how to use code generation to add support for transparent lazy loading to Entity Framework. It includes code generator (EFLazyClassGen), supporting library (Microsoft.Data.EFLazyLoading) and sample test applications.
  • Persistence Ignorance (POCO) Adapter for Entity Framework V1 Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    EF POCO Adapter enables Plain Old CLR Objects (POCOs) to be tracked with the released version of Entity Framework V1 using automatically generated adapter objects. It consists of a code generator, a supporting library, and a test suite with examples.

There are several more on the way. I’ll try to update this post when they become available.

5/22: Update: Two more samples went online today:

  • EF Extensions Compatible with .NET Framework 3.5 SP1 and Visual Studio 2008 SP1 (Visual Basic and C# versions available)
    The ADO.NET Entity Framework Extensions library includes utilities that make querying stored procedures, creating typed results from DB data readers and state tracking external data much easier in the Entity Framework. A sample application demonstrates several patterns using these utilities, including stored procedures with multiple result sets, materialization of CLR types, and registering entities in the Entity Framework state manager.
  • ADO.NET Data Services IUpdateable implementation for Linq to Sql
    Sample implementation of ADO.NET Data Services IUpdateable interface for Linq to Sql Data Sources.
Posted on - Comment
Categories: Entity Framework - VB - VB Dev Center - LINQ - ADO.Net Data Services -

Paging with AJAX WCF and LINQ

These days, it seems that every web site needs to have some use of gratuitous AJAX in order to stay on the bleeding edge. Since we didn't have any here yet, I thought I would throw some in for good measure. I liked the lazy loading of records instead of paging found on some sites, including the Google RSS reader and thought I would see what it would take to add something like that here.

If you're not familiar with this paging option: instead of loading a new page of records each time the user gets to the end, new records are simply appended to the end of the list via an AJAX (Asynchronous JavaScript and XML) call. My implementation is not nearly as fancy, but it gets the job done.

To try out the AJAX implementation we're going to discuss, browse to the AjaxPosts.aspx page. The implementation consists of three parts:

  • The web page that hosts the AJAX ScriptManager and provides the foundation of the page itself.
  • A JavaScript file which performs the client side paging functionality.
  • A WCF service which fetches the data and formats it for the client.

We'll start with the hosting page. The page itself is very simple. We'll make it easy to handle the formatting by continuing to use the same MasterPage that we use elsewhere on this site. That makes the content section rather concise.

The content section includes a ScriptManager which serves to push the AJAX bits down to the client. It also contains knowledge of our service using the asp:ServiceReference tag, and the client side JavaScript using the asp:ScriptReference tag. In addition, we include a blank div element which we will use to insert the content fetched from our service, and a button which is used to initiate requests for more posts from our service.

<%@ Page Language="VB" AutoEventWireup="false" CodeFile="AjaxPosts.aspx.vb"
    Inherits="AjaxPosts" MasterPageFile="~/Blog.master" Title="ThinqLinq" %>
<%@ Register Assembly="System.Web.Extensions, Version=3.5.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35"
    Namespace="System.Web.UI" TagPrefix="asp" %>

<asp:Content ID="content" runat="server" ContentPlaceHolderID="ContentPlaceHolder1">
    <asp:ScriptManager ID="ScriptManager1" runat="server">
        <Services>
            <asp:ServiceReference Path="~/API/AjaxServices.svc" />
        </Services>
        <Scripts>
            <asp:ScriptReference Path="~/AjaxServiceClient.js" />
        </Scripts>
    </asp:ScriptManager>
    <div id="Posts">
    </div>
    <button id="More" onclick="FetchPosts(); return false;">More Posts</button>
</asp:Content>

With the plumbing out of the way, we can start to focus on the meat of the AJAX implementation. First off, we need to have a way to access our data. My first inclination was to use ADO.Net Data Services to serve up dynamic data from our LINQ sources. However in this case, I decided that the functionality was quite limited in scope and a dedicated service would be the better option. A dedicated service would also offer some additional level of caching should scalability become an issue on the site. Also, I wasn't sure my JavaScript skills would be quite suited to the client side string parsing that would be necessary to build dynamic HTML content from the raw data. Thus in this case I decided to use a more standard WCF service.

Our WCF service takes as parameters the page number that we want to retrieve and the number of records that make up a page. We'll have our JavaScript pass these values. In response, it will send back a string containing the already formatted HTML that we will display. First we need to set up our class and method and decorate them with the necessary WCF attributes. When adding services that you want to AJAX enable, make sure to use the "AJAX Enabled WCF Service" template from the Add New Item dialog box rather than the standard "WCF Service". This will set up the web.config differently to allow the service to expose a way to access a JavaScript client proxy. (Thanks goes to Wally for helping me figure that one out.) For our service, we'll put it in the ThinqLinq namespace and call it AjaxServices. It will have one service method (indicated by the OperationContract attribute) called LoadFormattedPosts.


Namespace ThinqLinq
    <ServiceContract(Namespace:="ThinqLinq")> _
    <AspNetCompatibilityRequirements(RequirementsMode:=AspNetCompatibilityRequirementsMode.Allowed)> _
    Public Class AjaxServices

        <OperationContract()> _
       Public Function LoadFormattedPosts(ByVal page As Integer, ByVal pagesize As Integer) As String
       End Function
    End Class
End Namespace

Inside the LoadFormattedPosts we'll get to use our LINQy goodness. In this case, we know that we are going to need our posts with their associated categories and comments. As an optimization, we'll add the appropriate load options to fetch all of those at the same time. We'll also include the appropriate Skip and Take methods to do the paging. Our resulting LINQ to SQL query is fairly standard.


Using dc As New LinqBlogDataContext
    Dim LoadOptions As New DataLoadOptions
    LoadOptions.LoadWith(Of PostItem)(Function(p) p.Comments)
    LoadOptions.LoadWith(Of PostItem)(Function(p) p.CategoryPosts)
    LoadOptions.LoadWith(Of CategoryPost)(Function(cp) cp.Category)
    dc.LoadOptions = LoadOptions

    Dim posts = From p In dc.PostItems _
                Order By p.PublicationDate Descending _
                Skip pagesize * page _
                Take pagesize

    Dim response As String = FormatPosts(posts)
    Return HttpUtility.HtmlDecode(response)
End Using

This query would be fine if we only wanted to send the raw data back to the AJAX client. However, we'll take an extra step in this implementation and do the formatting in a separate FormatPosts method. We'll actually add a separate method to format the Categories as well.


Private Function FormatPosts(ByVal posts As IEnumerable(Of PostItem)) As String
  Dim response = _
      From p In posts _
      Select val = <div class="post">
                     <h2>
                        <a href=<%= "Default/" & p.TitleUrlRewrite %>><%= p.Title %></a>
                     </h2>
                     <div class="story"><%= p.Description %></div>
                     <div class="meta">Posted on <%= p.PublicationDate %> - 
                        <a href=<%= "Default/" & p.TitleUrlRewrite %>>
                        Comments (<%= p.Comments.Where(Function(c) c.IsApproved).Count() %>)</a>
                        <br/>
                        <%= If(p.CategoryPosts.Count > 0, _
                               "Categories:" & FormatCategories(p), _
                               "") %>
                     </div>
                   </div>.ToString()

  Return "<div class='posts'>" & String.Join("", response.ToArray) & "</div>"
End Function
Private Function FormatCategories(ByVal post As PostItem) As String
  If post.CategoryPosts.Count > 0 Then
    Dim response = _
       From catPost In post.CategoryPosts _
       Select val = <span>
                      <a href=<%= "http://www.ThinqLinq.com/Default.aspx?CategoryId=" & _
                                  catPost.Category.CategoryId %>>
                         <%= catPost.Category.Description.Trim %></a>
                    </span>.ToString()

    Return String.Join(", ", response.ToArray())
  Else
    Return ""
  End If
End Function

I'm not going to take the time to explain all of the LINQ to XML implementation here. I've discussed LINQ to XML in past posts. I do want to point out here a couple extra steps that need to be taken with this implementation.

First, we need to be careful in cases where we have posts without associated comments or categories. If we aren't, we will run into issues with null results and the query will fail. Second, we have to watch how and when we are casting to strings and arrays. The easiest way to make sure that we are casting properly is to have Option Strict turned On at the file or project level.

Now that we have our service set up, we can write the client side JavaScript code. Because we're using the AJAX ScriptManager and an AJAX enabled WCF service, we don't have to write any fancy plumbing code to access this service. To test this, we can browse to the service passing a /js or /jsdebug flag as follows: http://ThinqLinq.com/Api/AjaxServices.svc/js. That leaves just the task of writing the client side code to access the service and placing it in the AjaxServiceClient.js file for the ScriptManager to find.


var svcProxy;
var currentPage;
function pageLoad() {
    currentPage = 0;
    svcProxy = new ThinqLinq.AjaxServices();
    svcProxy.set_defaultSucceededCallback(SucceededCallback);
  
    FetchPosts();
}

function FetchPosts() {
    svcProxy.LoadFormattedPosts(currentPage, 5);
}

function SucceededCallback(result) {
    var postTag = document.getElementById("Posts");
    postTag.innerHTML += result;
    currentPage += 1;
}

if (typeof (Sys) !== "undefined") Sys.Application.notifyScriptLoaded();

In this JavaScript, we set up two global variables: currentPage to manage the paging functionality and svcProxy to act as the instance of the proxy that accesses our service. In the pageLoad function, we initialize these values. Once initialized, we then set the asynchronous callback function for when values are retrieved. Finally, we invoke the method to fetch the posts for the first time that the page is loaded.

To fetch the posts, we simply call the LoadFormattedPosts method of our proxy which sends the request to our WCF service. Because this is performed asynchronously, we will just fire the request and let the callback handle the response.

In the SucceededCallback method, we grab the return value from the WCF service in the result parameter. Once we have that, we get a reference to the placeholder "Posts" div in the AjaxPosts.aspx document. To add the new results to the client, we concatenate the current contents with the new result value using +=. Finally, we increment the currentPage number so that the next request will fetch the next page of posts.

That's it. We're done. Jump over to http://ThinqLinq.com/AjaxPosts.aspx to see the result in action. There are a number of things that can be improved on this implementation, but it is a start. One definite drawback on this implementation is that it is not SEO friendly. You can see this by viewing the source to the resulting page. Notice that none of the post contents are included in the source.

I don't claim to be a JavaScript or AJAX expert and I'm sure there are other more elegant solutions. I'd love to learn from your experience, so feel free to post your recommendations and we'll see what we can do to improve this.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - Ajax - WCF -

Adding Gravatar support to comments

Having a bit more free time than expected, I thought I would take a bit of time and add some features to this site based on some things I've seen at other sites. A quick one is adding Gravatar support to the comments. If you're not familiar with gravatars, www.Gravatar.com describes them as "a globally recognized avatar": quite simply an image that follows you from site to site, appearing beside your name when you do things.

To get a Gravatar, go to Gravatar.com and let them know your email address and the picture you want associated with it. Once you are done, any site that supports gravatars will then show your picture to the world. That's your task.

My task is to enable this site to access your picture. They have a number of add-ins for various blogging engines, but since this one is custom, we'll have to implement it ourselves. Luckily it is as simple as adding an img tag to our site with the source being a URI which includes information about the user that we want to display. Here's a sample URI which would show my picture:

<img src="http://www.gravatar.com/avatar.php?gravatar_id=58d453f6449cc9125948bd153bc4272b&rating=G&size=40" alt="Gravatar" />

Let's break the source attribute down a bit. Essentially it is a URI to a PHP site on the Gravatar server: http://www.gravatar.com/avatar.php. It includes three query string parameters: the gravatar_id, rating and size.

The rating and size are easy enough. For this site, we're going to keep it clean (although we do get Kinq-y at times) so we'll keep the site rated G. Other sites could use one of their other rating levels (g, pg, r, or x).

For the size, you can specify that the image be anywhere between 0 and 512 pixels. To keep the page load small here, I'll ask for images 40 px by 40 px and set the size=40.

With that out of the way, we need to generate the value for the gravatar_id parameter. In a nutshell, the id is just an MD5 hash of the commenter's email address. When we set up the ability to add comments to this site, we made the email address a required field, so we are already storing that. All we need to do is convert it and bind to it in our custom img tag. To encapsulate that functionality and keep it with the comments themselves, we will add a partial class for Comment and put our new property in there. We don't add it directly to the Comment class that the dbml file generates as it would be deleted if we ever decide to regenerate that file. With partial classes, we can retain parts of the class definition in multiple physical files and the compiler will combine them within the generated assembly. Here's the definition of this class and our new property:


Public Class Comment
    Public ReadOnly Property GravatarSource() As String
        Get
        End Get
    End Property
End Class

Notice here, we don't need to specify that this is a partial class because the one generated by our DBML designer already carries the Partial modifier. As long as we're in the same namespace, in VB, we're fine. I should point out however that there are some limitations on how we can use this partial property in LINQ queries (see http://www.thinqlinq.com/Default/Projecting-into-an-unmapped-property-from-a-LINQ-to-SQL-query.aspx).

Now for getting this value. To make binding simple, we'll just format the entire URI in this method (we're using a property here due to limitations in data binding to methods). Using the String.Format method, we can insert the hash into our URI as follows:

Return String.Format("http://www.gravatar.com/avatar.php?gravatar_id={0}&rating=G&size=40", GetEmailHash())

The body of the GetEmailHash function is where the meat of our work happens. In this, we will encode the value of the commenter's email address which we can access from the other part of the partial class as the CreatorEmail property. To do that, we need to encode the string into a byte array. Then, using an MD5CryptoServiceProvider instance, we can compute the hash into a new byte array.


Dim enc As New UTF8Encoding()
Dim hashProvider As New MD5CryptoServiceProvider
Dim bytes() As Byte = hashProvider.ComputeHash(enc.GetBytes(Me.CreatorEmail))

Finally, we need to piece the hashed array back into a string. In the case of the Gravatar system, each byte needs to be converted to its HEX representation and lowercased. We could use a For Each loop and iterate over the results building it up dynamically, but this is a great case for using LINQ to Objects to replace an iteration:

From b In bytes Select b.ToString("X2").ToLower()

We can then concatenate the resulting HEX strings using the String.Join. Here's the completed definition of this class:


Imports System.Security.Cryptography
Imports System.Text
Imports System.IO 

Public Class Comment
    Public ReadOnly Property GravatarSource() As String
        Get
            'We need the MD5 hash of the email address
            Return String.Format("http://www.gravatar.com/avatar.php?gravatar_id={0}&rating=G&size=40", GetEmailHash())
        End Get
    End Property
    Private Function GetEmailHash() As String
        Dim enc As New UTF8Encoding()
        Dim hashProvider As New MD5CryptoServiceProvider
        Dim bytes() As Byte = hashProvider.ComputeHash( _
                           enc.GetBytes(Me.CreatorEmail))
        Return String.Join("", _
                           (From b In bytes _
                            Select b.ToString("X2").ToLower()) _
                            .ToArray())
    End Function
 End Class

Now, to add this property to our UI. Since we are already set up to bind to the comment object in our CommentRepeater control, we just add a new line to specify the img tag:

<a href="http://www.gravatar.com" title="Get your avatar"><img width="40" height="40" style="float: right; padding-left: 10px;" src="<%# Eval("GravatarSource") %>" alt="Gravatar" /></a>

That's it. If you want to see the gravatar in action, head on over to their site and sign up. Then come back here and leave a comment on this post. I'd love to see the faces of people who enjoy this site.
Posted on - Comment
Categories: VB Dev Center - LINQ - VB - SEO -

Delegates and Lambdas in VB 10

Yesterday, I had the pleasure of demonstrating Delegates, Lambdas and Expressions at the Alabama Code Camp. This is not the first time I've presented this. The original demo project is still available in the Downloads section of this site. This time however, I was able to round out the VB set of the demos to demonstrate all of the abilities available in C# 3.0, thanks to the new statement and multi-line lambda language enhancements coming in VB 10. The complete VB 10 version of the project is available to download now as well.

Here are some of the added methods. (Note, this has been tested against the October 2008 beta release of Visual Studio 2010. Be aware of the evaluation time bomb in this release if you intend to use it.) The DoTrick method takes an Action delegate, which means that it takes no parameters and returns no values. In VB this equates to a Sub and in VB 10 is referred to as a Statement Lambda.

'Anonymous delegate
animal.DoTrick(Sub() Console.WriteLine("Anonymous Method"))

' Lambda expression as a variable
Dim LiftOneLeg As TrickAction =
   Sub()
      animal.LegsOnGround -= 1
   End Sub
animal.DoTrick(LiftOneLeg)

'Lambda expression inline
animal.DoTrick(Sub()
      animal.LegsOnGround += 3
   End Sub)

'Multi-line lambda
animal.DoTrick(Sub()
      animal.LegsOnGround = 0
      Console.WriteLine("I'm Jumping now!")
      animal.LegsOnGround = 4
   End Sub)

I also included a nice concise combination of VB 9 function lambdas with VB 10 statement lambdas. In this case, I find the animals that have broken business rules (Predicates, which are delegates that return a Boolean) and then iterate over them using the ForEach method, which takes an Action delegate that can be fulfilled with a statement lambda.

animals _
   .Where(Function(item) Not item.IsValid).ToList _
   .ForEach(Sub(item) Console.WriteLine(item.ToString))

For this update, I also include a demonstration of a common use of lambdas, particularly in Silverlight applications where all data access is asynchronous. Since all delegates actually inherit from MulticastDelegate, all delegates by default offer asynchronous BeginInvoke/EndInvoke possibilities in addition to the synchronous Invoke method.

With BeginInvoke, we use a lambda that takes an IAsyncResult object as a parameter. In the past we had to declare the callback as a separate method. With the inclusion of multi-line statement lambdas, we can now declare the callback directly inline with the BeginInvoke request as follows:

Public Sub BeginTrick(ByVal trick As Action)
   trick.BeginInvoke(Sub(result)
         Thread.Sleep(1000)
         Console.WriteLine("...Trick Completed for " & ToString())
      End Sub, Me)
End Sub

If you missed this presentation, I'll be doing it again at the South Florida Code Camp next week. Come on down and check it out.

Posted on - Comment
Categories: Code Camp - VB Dev Center - VB -

LINQ supported data types and functions

When we were writing LINQ in Action, we weren't able to specify all of the possible methods and functions that have supported query translations for a few reasons.

  1. There were too many to be included in the scope of the book.
  2. The book was being written at the same time that LINQ was evolving and more comprehensions were being supported, thus giving us a moving target that we couldn't ensure the accuracy of when the product shipped.
  3. We realized that over time, translations for more functions may be added and enumerating the list in the book might not reflect the current methods supported with a given framework version.

As I was searching for an answer to a recent question, I happened upon a listing on MSDN showing the functions and methods which are and are not supported. The full list of LINQ to SQL supported and unsupported members is available online at http://msdn.microsoft.com/en-us/library/bb386970.aspx.

As an example, the following DateTime members are shown as having translations: Add, Equals, CompareTo, Date, Day, Month, and Year. In contrast, members like ToShortDateString, IsLeapYear, and ToUniversalTime are not supported.
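
For instance, a query that sticks to supported DateTime members translates cleanly to TSQL. This is a sketch assuming a Northwind-style context with an Orders table whose OrderDate column is nullable:

```vb
'Month has a SQL translation, so this filter runs entirely on the server.
Dim juneOrders = From o In dc.Orders _
                 Where o.OrderDate.Value.Month = 6 _
                 Select o
```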

If you need to use one of the unsupported methods, force the results to the client and evaluate them there using LINQ to Objects. You can do that with the .AsEnumerable extension method at any point in the query comprehension. Any portion of the query that follows AsEnumerable will be evaluated on the client side.
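
As a hypothetical example (again assuming Northwind-style entity and column names), the work can be split like this:

```vb
'The Where clause is translated to TSQL and runs on the server.
Dim shipped = From o In dc.Orders _
              Where o.ShippedDate.HasValue

'AsEnumerable shifts evaluation to LINQ to Objects, so the unsupported
'ToShortDateString call runs on the client over the fetched rows.
Dim display = From o In shipped.AsEnumerable() _
              Select o.OrderID, Shipped = o.ShippedDate.Value.ToShortDateString()
```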

Posted on - Comment
Categories: VB Dev Center - LINQ -

Adding a dynamic SiteMap for search engine optimization using LINQ

A couple of months ago I added a feature that dynamically builds a Site Map for this site based on the information in the database for posts and the files for the downloads. If you're not familiar with how sitemap files can help your site's searchability, Google has good documentation about sitemaps in its Webmaster Tools.

The Sitemap protocol is a rather simple XML document consisting of a set of url nodes, each of which contains the following:

  • loc - URL for the page link
  • lastmod - Date the page was last modified
  • changefreq - How often the page is changed
  • priority - How high you think the page should be ranked relative to other pages on your site.

For this site, I decided to index the main (default.aspx) File, about and contact pages. In addition, I indexed each post as a separate url node. If you want to view the resulting data, browse to http://www.thinqlinq.com/sitemap.aspx. To do this, I used LINQ to XML with VB XML Literals. To begin, we need to add the XML Namespaces. At the top of our file, we enter the following imports:

Imports <xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
Imports <xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

We continue by adding the root urlset node and one child node representing the main page:

Dim map = _
    <?xml version='1.0' encoding='UTF-8'?>
    <urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9
            http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"
        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

        <url>
            <loc>http://www.ThinqLinq.com/default.aspx</loc>
            <lastmod>
               <%= (From p In dc.PostItems _
                    Order By p.PublicationDate Descending) _
                    .First.PublicationDate.ToString("yyyy-MM-dd") %>
            </lastmod>
            <changefreq>daily</changefreq>
            <priority>1.0</priority>
        </url>
    </urlset>

Most of this is standard XML. The main difference is the use of a LINQ query to show the last modification date based on the most recent post from our database. In this case we just want the First date when the dates are ordered descending. We do need to format it properly so that our search engine (Google) will be able to recognize it.

Next up, we need to add the link for the Downloads page. We'll do this much the same way that we added the url for the default page. However, in this case the modification date won't come from the database, but rather use a LINQ to Objects query to get the most recent file in the downloads directory on this site.

<url>
    <loc>http://www.ThinqLinq.com/Files.aspx</loc>
    <lastmod>
        <%= (From f In New System.IO.DirectoryInfo( _
            Server.MapPath("~/Downloads")).GetFiles _
            Order By f.LastWriteTime Descending) _
            .FirstOrDefault.LastWriteTime.ToString("yyyy-MM-dd") %>
    </lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
</url>

The About and Contact pages are relatively straightforward. The remaining url nodes are generated from the records in PostItems in our database. To populate them, we'll create a LINQ query pulling the data from the database using LINQ to SQL and projecting (Select) out an individual url node for each row:

<%= From p In dc.PostItems.ToList _
    Select <url>
               <loc>http://www.ThinqLinq.com/default/<%= p.TitleUrlRewrite %>.aspx</loc>
               <lastmod><%= p.PublicationDate.ToString("yyyy-MM-dd") %></lastmod>
               <changefreq>daily</changefreq>
               <priority>0.3</priority>
           </url> %>


As you can see, there isn't much here that is overly complex. It's just a series of LINQ queries filling in the data from various sources. For reference purposes, here's the complete code:

Imports <xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
Imports <xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">

Partial Class SiteMap
    Inherits System.Web.UI.Page

    Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
        Response.Buffer = True
        Response.Clear()
        Response.Cache.SetCacheability(HttpCacheability.NoCache)
        Response.ContentType = "text/xml"
        Response.AddHeader("Content-Disposition", "inline;filename=blog.rss")
        WriteRss()
        Response.End()
    End Sub

    Private Sub WriteRss()
        Try
            Using dc As New LinqBlog.BO.LinqBlogDataContext
                Dim map = _
                    <?xml version='1.0' encoding='UTF-8'?>
                    <urlset xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                        xsi:schemaLocation="http://www.sitemaps.org/schemas/sitemap/0.9 http://www.sitemaps.org/schemas/sitemap/0.9/sitemap.xsd"
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">

                        <url>
                            <loc>http://www.ThinqLinq.com/default.aspx</loc>
                            <lastmod>
                                <%= (From p In dc.PostItems _
                                    Order By p.PublicationDate Descending) _
                                    .First.PublicationDate.ToString("yyyy-MM-dd") %>
                            </lastmod>
                            <changefreq>daily</changefreq>
                            <priority>1.0</priority>
                        </url>
                        <url>
                            <loc>http://www.ThinqLinq.com/Files.aspx</loc>
                            <lastmod>
                                <%= (From f In New System.IO.DirectoryInfo( _
                                    Server.MapPath("~/Downloads")).GetFiles _
                                    Order By f.LastWriteTime Descending) _
                                    .FirstOrDefault.LastWriteTime.ToString("yyyy-MM-dd") %>
                            </lastmod>
                            <changefreq>weekly</changefreq>
                            <priority>1.0</priority>
                        </url>
                        <url>
                            <loc>http://www.ThinqLinq.com/about.aspx</loc>
                            <lastmod>
                                <%= System.IO.File.GetLastWriteTime( _
                                    Server.MapPath("About.aspx")).ToString("yyyy-MM-dd") %>
                            </lastmod>
                            <changefreq>monthly</changefreq>
                            <priority>1.0</priority>
                        </url>
                        <%= From p In dc.PostItems.ToList _
                            Select <url>
                                       <loc>http://www.ThinqLinq.com/default/<%= p.TitleUrlRewrite %>.aspx</loc>
                                       <lastmod><%= p.PublicationDate.ToString("yyyy-MM-dd") %></lastmod>
                                       <changefreq>daily</changefreq>
                                       <priority>0.3</priority>
                                   </url> %>
                        <url>
                            <loc>http://www.ThinqLinq.com/Contact.aspx</loc>
                            <lastmod>2008-02-28</lastmod>
                            <changefreq>never</changefreq>
                            <priority>0.1</priority>
                        </url>
                    </urlset>
                Response.Write(map)
            End Using
        Catch ex As Exception
            Response.Write(<error><%= ex.ToString %></error>)
        End Try
    End Sub
End Class

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - VS 2008 - SEO - Linq to XML -

Updated source for ThinqLinq now available

It's been a while since I posted some serious content for which I apologize. I started this site a year and a half ago as a proof of concept around LINQ. When the site went live, I included the download version of the site from the presentations I've been doing on LINQ and Asp.Net. In all this time, I've made significant updates to the site, but haven't made them available, until now. If you're interested, you can download the updated bits from the file downloads here and play with them. I did take the liberty of changing the actual production database, removing some of the site metrics, and other sensitive personal information.

As the site has evolved, I tried to document the enhancements. Here's a copy of some of the related links to help you step through some of the changes.

For those that attended my session at the Philly Code Camp 2008.3, this is the version of the project that I was discussing. Of course, there's always more enhancements that I would like to do, including centralizing access to a more common DataContext, using CompiledQueries, and other performance enhancements. If you have any recommendations, I'm all ears to hear what you thinq.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Can and Should with Legacy Constructs

There are times when you wish you didn't have to worry about legacy code. This is particularly true with programming languages, where constructs need to be supported even if they have long outlived their usefulness. Consider the following code that many of us "old-timers" learned in order to make the TRS-80 in the computer store go into an infinite loop (because the salesperson didn't know how to break out of the loop).

Public Shared Sub Main()
10: Dim x As String = "This is a test" : Console.Write(x) : GoTo 10
End Sub

OK, if you're paying attention, this isn't old VB code, but VB.Net (any version). Yes, you can still use line numbers, GoTo, and the colon to put multiple statements on a single line. Repeat after me: "Just because you can doesn't mean you should." Not only is this ugly code to maintain, it's just sloppy. Please don't code like this.

Posted on - Comment
Categories: VB Dev Center - VB -

LINQ is not an excuse for sloppy code

A couple months ago, I was convinced to try Twitter. In the process, I found www.TweetBeep.com which sends me notifications whenever someone tweets the word LINQ. Today I saw the following:

"my visual studio crashed on retrieving 39,450 records via Linq.. what a shame.. looking for a workaround and a reason.." (name withheld to protect the guilty).

In some ways, this falls into the category of, "Just because you can doesn't mean you should." In this case, the fault lies in the business requirement. There shouldn't be any reason why you should fetch 39,000 records. What user in their right mind would page through that many results?

While admittedly many demos (including some that I present) show queries such as the following, we do that knowing that the resulting row counts will be relatively small.

Dim custs = From c In Customers _
                   Order By c.Name _
                   Select c

One of the great things about LINQ is its compositionality. With it, you can add qualifications to your query (like paging and filtering) without affecting the underlying query. If your queries don't utilize paging and/or filtering, make sure you know the underlying data and that returning that many records is reasonable (and isn't likely to grow substantially in the future). As an example, we can extend the above query and add paging as follows:

Dim custs = From c In Customers _
                   Order By c.Name _
                   Select c

Dim pageSize = 10
Dim paged = custs _
                    .Skip(currentPageNumber * pageSize) _
                    .Take(PageSize)

Additionally, I strongly recommend limiting the results prior to fetching by using a Where clause. There are a number of sites out there that show how to progressively filter results (for a ComboBox, a TextBox's AutoCompleteSource, or the AJAX AutoComplete extender). Extending the functionality so that the suggested items don't appear until the results are sufficiently (depending on your data) filtered is equally easy with LINQ:

If SearchString.Length > 2 Then
     Dim custs = From c In Customers _
                   Where c.Name.StartsWith(SearchString) _
                   Order By c.Name _
                   Select c _
                   Take 25
     CustName.AutoCompleteSource = custs
Else
     'Not sufficiently filtered. Keep the suggestion list blank.
     CustName.AutoCompleteSource = New String() {}
End If

In the case of the original question, LINQ isn't at fault. It is a tool for you to use. Fetching 39,000 records using LINQ or ADO.Net is an equally bad idea. Use the tools to their best effect, and don't get sloppy in your coding practice.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

ThinqLinq on Deep Fried Bytes

Back at DevLinq (er, DevLink), I had a chance to sit out on the back porch with Keith and Woody and chat about some of the cool features and rusty washers that you can find in LINQ. Head on over to DeepFriedBytes.com and check out our conversation. You can also download the webcast to listen to on demand. Even better, you can listen on your Zune. As always, let me know what you Thinq.
Posted on - Comment
Categories: VB Dev Center - LINQ - VS 2008 -

Enabling intellisense for LINQ to SQL XML Mapping files

A while back, I showed you how to use an XML Mapping file to enable POCO support with LINQ to SQL instead of requiring you to clutter your class libraries with LINQ to SQL attributes. It turns out, the schema for the XML Mapping files (xmlns="http://schemas.microsoft.com/linqtosql/mapping/2007") may not be included in your install of Visual Studio 2008. It was missing from mine. Luckily, the file is available on MSDN. If you add it to the schemas recognized by Visual Studio, you will get instant intellisense when editing your XML Mapping files. Here's how:

  1. Copy the schema from the Visual Studio 2008 Development Center available here.
  2. Paste it into a new XSD file and save it to your hard drive. If you want to stay consistent with the other schema files, locate your VS 2008 program directory and put it in the \Xml\Schemas directory. On my computer, this is located in the C:\Program Files\Microsoft Visual Studio 9.0\Xml\Schemas directory.
  3. In Visual Studio, Click on XML on the main menu and then select Schemas. This will open the XML Schemas editor window.
  4. Click the "Add..." button. Locate and select the XSD file that you saved in step 2.
  5. Double check that there is a check in the Use column for your newly added schema.

That's it. Now, the next time you're editing your XML Mapping files, you will get the intellisense goodness that Visual Studio has to offer.

Posted on - Comment
Categories: VB Dev Center - LINQ -

Enable the Expression Tree Visualizer in VS 2008

In LINQ in Action, we discuss how to add the LINQ to SQL Query visualizer into the Visual Studio 2008 environment. This tool allows you to open a window during debug time to view the TSQL that is generated from the LINQ expression tree. It also allows you to run the query and view the results. If you're not familiar with it, check out this post by Scott Guthrie.

In addition to the query visualizer, you can also build and install the Expression Tree visualizer, not only as a separate application, but also as an integrated visualizer within Visual Studio 2008. To do this, download the LINQ samples from the MSDN Code Gallery. Inside, you can find a project for the ExpressionTreeVisualizer. To use it as a stand-alone utility, build and run the ExpressionTreeVisualizersApplication. This is the method most people are familiar with.

Building the solution will also build the ExpressionTreeVisualizer library, which is the one you need in order to enable the visualizer in Visual Studio natively. Copy the generated ExpressionTreeVisualizer.dll library and paste it into your ..\Program Files\Microsoft Visual Studio 9.0\Common7\Packages\Debugger\Visualizers directory.

Once you have placed the library in the visualizers directory, let's see what you can do with the new visualizer. First, let's build a LINQ to SQL query:


Dim query = From cust In dc.Customers _
            Where cust.City = "London" _
            Order By cust.CompanyName _
            Select cust

Given this query, we need to access the expression object exposed by the IQueryable query object as follows:

Dim queryExpression = query.Expression

Now that we have our code set up, set a breakpoint after you have instantiated the queryExpression variable and debug your project. If you hover over the query.Expression method, you'll see a new magnifying glass as shown below:

 

Clicking on the visualizer icon will launch the visualizer tool, revealing the following screen:

 

Sure, there's lots of information in there. The expression trees are quite complex. This tool helps you decipher them in cases where you need to either parse or dynamically create expression trees in your applications.

Posted on - Comment
Categories: VB Dev Center - LINQ - VS 2008 -

Object Identity tracking changes with LINQ to SQL SP1

When we wrote LINQ in Action, we took a bit of time to explain how the identity tracking system works in LINQ to SQL to make sure that changed objects are retained when subsequent queries are issued against a data context. In a nutshell, when you issue a query, the data context translates the LINQ query into TSQL and sends that to the database. The database returns the rowsets to LINQ to SQL. The provider checks the returned rows against those it is already tracking from previous fetches and, rather than instantiating the object again, returns the object from its internal store. This is done primarily to ensure that changes a user has made during the context's lifetime are retained rather than being overwritten.

We also discussed (p. 258 if you're following along) how there is a special optimization wherein if you are querying for a single result, the pipeline would check the internal cache first before looking at the database, thus reducing the overhead of repeated hits to the database. An astute reader checked out our claim, and sure enough that optimization did not make it into the RTM bits of VS 2008. We considered fixing this in the second printing, but consulted with the product teams first. It turns out that the intended behavior was indeed to include this optimization, but due to a last minute bug, it didn't make it in.

As Dinesh points out, this oversight has been fixed in SP1. Now, if you try to fetch a single object (using Single, SingleOrDefault, First, or FirstOrDefault), the in memory object cache will be checked based on the identity columns declared in the entity's structure. If a matching object is found, it will be returned, otherwise the record will be requested from the database.
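
As a rough sketch of the SP1 behavior (the context and key names here are assumed, Northwind-style), the second single-object fetch below can now be satisfied from the identity cache rather than a second round trip:

```vb
Using dc As New NorthwindDataContext()
    'First fetch: hits the database and caches the entity by its identity column.
    Dim cust1 = dc.Customers.Single(Function(c) c.CustomerID = "ALFKI")

    'Second fetch by the same key: SP1 checks the in-memory cache first,
    'so the same instance is returned without issuing another query.
    Dim cust2 = dc.Customers.Single(Function(c) c.CustomerID = "ALFKI")

    Console.WriteLine(Object.ReferenceEquals(cust1, cust2)) 'True
End Using
```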

 

Posted on - Comment
Categories: VB Dev Center - LINQ -

Where clause optimized with VB 2008 SP1

There are subtle differences between VB and C# in terms of nullability. These differences caused a significant difference in the TSQL generated for even simple LINQ queries. Consider the following query against Northwind's Orders table, where the Freight column is generated as a nullable type:

 Dim filtered = _
       From o In dc.Orders _
       Where o.Freight > 100 _
       Select o

First the bad news: with the RTM of VB 2008, this query resulted in the following TSQL:

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [dbo].[Orders] AS [t0]
WHERE (COALESCE(
    (CASE
        WHEN [t0].[Freight] > @p0 THEN 1
        WHEN NOT ([t0].[Freight] > @p0) THEN 0
        ELSE NULL
     END),@p1)) = 1

In this case, I've highlighted the relevant portion of the Where clause. Notice here the simple comparison translates into a COALESCE(CASE WHEN... construct. This was done due to VB's handling of nullable type comparisons under the covers.

AND NOW THE GOOD NEWS:

With VB 2008 SP1, the difference in nullability was smoothed over by the SqlProvider used by LINQ to SQL. As a result, the above LINQ query now generates the following TSQL:

SELECT [t0].[OrderID], [t0].[CustomerID], [t0].[EmployeeID], [t0].[OrderDate], [t0].[RequiredDate], [t0].[ShippedDate], [t0].[ShipVia], [t0].[Freight], [t0].[ShipName], [t0].[ShipAddress], [t0].[ShipCity], [t0].[ShipRegion], [t0].[ShipPostalCode], [t0].[ShipCountry]
FROM [dbo].[Orders] AS [t0]
WHERE [t0].[Freight] > @p0

Notice in this case that the Where clause directly reflects the statement from the LINQ query, and the performance from SQL Server's perspective is GREATLY improved! Who says LINQ to SQL is dead?

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

VB Ordering of Anonymous Type Properties change with VS 2008 SP1

The VS 2008 SP1 includes lots of new features (more than a typical service pack, but that's another matter). There are a number of smaller enhancements that could easily go un-noticed otherwise. One of these is to fix a bug in the way the VB compiler generates anonymous types.

In most cases, you will only notice this if you are binding an anonymous projection to a DataGridView or the ASP GridView. With these controls, you will find that the columns used to be generated alphabetically rather than in the order in which they were declared in the anonymous type projection. Consider the following query (using the Northwind Customers table of course):

Dim query = From cust In dc.Customers _
           Select cust.CompanyName, cust.ContactTitle, cust.ContactName, cust.Address

With VS 2008 RTM, a grid bound to these results would display the columns in the following order unless you explicitly set the columns: Address, CompanyName, ContactName, ContactTitle. Notice here that they are listed alphabetically. Interestingly if you use the same query in C#, the columns would be displayed retaining the order of the projection.

To determine the cause of the difference, one must view the actual IL using ILDASM. Comparing the C# IL with the VB IL, you can see that the C# compiler retains the order whereas VB alphabetized it. No, you can't use Lutz's Reflector this time, because it too alphabetizes the properties from the IL.

This was true until VS 2008 SP1 which fixed this bug. Now, the anonymous types are generated in the order in which they were declared and the column ordering is retained. Running the same query we had above results in the following column ordering with VB 2008 SP1: CompanyName, ContactTitle, ContactName, Address.

Thanks to the VB team for fixing this bug in the compiler.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - VS 2008 -

Filling an object from a DataReader with LINQ using DataContext.Translate

One of the key things that LINQ to SQL does for us is offer a quick way to fill a set of objects with data from a database. Typically this is done by setting up some mapping and calling the GetTable method on the DataContext. There are cases, particularly when you already have an infrastructure set up to populate objects using a DbDataReader, where it would be nice if you could just populate the objects without the need to set up a mapping.

The DataContext has a little-known method called Translate which can take a DbDataReader as a parameter and fill an IEnumerable list of objects without the need for mapping structures. To use it, specify the type you want to populate, pass in the instance of the reader, and let it do its magic. Here's a sample:

Using cn As New SqlClient.SqlConnection(My.Settings.NorthwindConnectionString)
  cn.Open()
  Using cmd As New SqlClient.SqlCommand("SELECT * FROM Region", cn)
    Dim reader = cmd.ExecuteReader()
    Using nwind As New NorthwindDataContext
      Dim RegionList = nwind.Translate(Of Region)(reader)
      For Each item In RegionList
        Console.WriteLine(" {0}: {1}", item.RegionId, item.RegionDescription)
      Next
    End Using
  End Using
End Using

In this case, we tell the context to translate the resultset of the reader into a generic IEnumerable(Of Region). There are number of caveats to keep in mind here:

  1. The column name in the result set must correspond to the property name. The translation is quite forgiving, however; it is not case sensitive.
  2. If you have a column in the result set with no corresponding property, that column will be ignored (no exception will be thrown).
  3. If you have a property in your object with no corresponding column in the result set, that property's value will not be set.
  4. The translation must be done to a known type. You cannot project into an anonymous type. If the type is not known, you can return an Object or a simple IEnumerable and then use reflection to work with the results.
  5. Since the translate method returns an IEnumerable, make sure not to close the connection or the reader before iterating over the results.

The Translate method supports three overloads:

    Public Function Translate(Of TResult)(ByVal reader As DbDataReader) As IEnumerable(Of TResult)
    Public Function Translate(ByVal reader As DbDataReader) As IMultipleResults
    Public Function Translate(ByVal elementType As Type, ByVal reader As DbDataReader) As IEnumerable

Translate does not require any mappings to be set. It simply sets the public property values based on the column names. If you have business logic in your Property Set implementations, that logic will run. As far as I can see, there is no way to instruct Translate to use the private storage field instead of the public property.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Fetching child records using Stored Procedures with LINQ to SQL

You can consume stored procs rather than the standard dynamic SQL for accessing child objects. To do this, set up your fetch stored procs and make sure that they return the correct data type (not the standard custom-generated type for stored procedures). To load a child collection, create a method on the partial implementation of your context. Name the function "LoadCs", where "C" is the name of the child property accessor on the parent object in the designer. This function takes the parent type as a parameter and returns an IEnumerable of the child type. The names you use must agree with the names of the types and properties in your entities in order for this to work.

   Public Class CustomDataContext

      'Load a child collection
      Public Function LoadCs(ByVal parent As P) As IEnumerable(Of C)
          Return Me.LoadCs(parent.ID)
      End Function

   End Class

The process to load a single child is similar. In this case, the function needs to be the singularized version of your entity and the return type will be the actual entity type rather than an IEnumerable as follows:

      'Load a single child
      Public Function LoadC(ByVal parent As P) As C
          Return Me.LoadC(parent.CId).SingleOrDefault
      End Function

Using these methods causes the context to lazy load the child objects. The default change tracking implementation will continue to work and if you have replaced the runtime behavior for the CUD operations with stored procedure implementations, they will be used just as if you fetched the objects through LINQ generated dynamic SQL.
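
To make the convention concrete, here is a hypothetical sketch for a Customer parent with an Orders child collection, where GetOrdersByCustomer is a stored proc method assumed to be mapped in the designer and returning the Order entity type:

```vb
Partial Public Class CustomDataContext

    'The name LoadOrders matches the Customer.Orders child property, so the
    'context uses this method for lazy loading instead of dynamic SQL.
    Public Function LoadOrders(ByVal customer As Customer) As IEnumerable(Of Order)
        Return Me.GetOrdersByCustomer(customer.CustomerID)
    End Function

End Class
```

The single-child case follows the same naming convention with the singularized property name, as shown in the snippets above.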

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Screen scraping and creating Word documents with LINQ to XML

At TechEd Developers 2008 in Orlando, I had the pleasure of competing in Speaker Idol. In that competition, we had the opportunity to present a topic in 5 minutes. Unfortunately, the topic I chose really needed 10 minutes to cover at the level of detail it deserved. Instead of limiting the topic, I decided to go ahead and present it a bit too fast.

If you want to see the video, or see how to use VB 9's XML Literals and LINQ to XML to fetch data from a web page (which must be XHTML compliant), manipulate it, and insert it into a Word 2007 file, it is now available on the Developer Landing page and the Library page of the TechEd site. If you prefer, you can jump right to the video in either WMV or MP4 format. If you're not familiar with LINQ to XML, go ahead and download the video and just watch it at half speed ;-)

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - Linq to XML -

Anonymous Type property ordering in VB

Many people have noticed when binding an anonymous type to a grid in VB that the order of the properties does not reflect the order that they were specified in the projection (Select) clause. Instead, they appear alphabetized. Consider the following query:

Dim query = From c In Customers _
                    Select c.LastName, c.FirstName, c.BirthDate

If you bind this query to a DataGrid or DataGridView and allow the columns to be generated automatically, the results will be displayed with the following column ordering: BirthDate, FirstName, LastName. Notice that the columns are ordered alphabetically based on the property name.

I just received word that this behavior is changing in the next update to VB. After this release, the property order should be retained from the projection clause, just as it is in C#.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - VS 2008 -

Projecting into an unmapped property from a LINQ to SQL query

On page 216 of LINQ in Action, I made a comment that unmapped properties in a mapped class cannot be used in a LINQ to SQL projection. This was true with the beta bits, but only partially true with the release bits. To begin, let's consider the Author table we have in the book samples.

The Author class has separate fields for the first and last name. Each of these is mapped to the corresponding fields in the Author table. In the book, we show how you can create a read only property in a partial class (so that it won't get clobbered when we regenerate our classes in the designer). The new property is trivial:

   Partial Public Class Author
        Public ReadOnly Property FormattedName() As String
            Get
                Return Me.FirstName & " " & Me.LastName
            End Get
        End Property
    End Class

Notice here that there are no mapping attributes to this property. In part, that is because there is no corresponding field in the table. As we show in the book, you are free to query the author table and return Author objects. From there, you can display the FormattedName as follows:

           Dim authors = From a In context.Authors _
                          Select a
            For Each a In authors
                Console.WriteLine(a.FormattedName & "; " & a.WebSite)
            Next

This works fine because we are projecting the complete Author type. However, in early builds, we couldn't project the unmapped properties into an anonymous type like this:

            Dim authors = From a In context.Authors _
                          Select a.FormattedName, a.WebSite

If you tried to use this projection, you would get a runtime exception. In the RTM bits, the behavior was modified. Now, if you try to run the above query (sample 8.25 in the book samples for anyone following along), you will see that the query succeeds and the anonymous type is populated. So how does the provider know how to populate FormattedName when it is not mapped and doesn't exist in the table itself? No, the provider doesn't look inside the property, determine which mapped properties are used, and fetch only those. While that could work in our simple example, many unmapped properties would require significantly more resources to analyze, may reference members that are not part of our class, or may call methods with no direct translation to TSQL. If you look at the generated SQL that is issued when the query is consumed, you can figure out what is happening in this case.

SELECT [t0].[ID], [t0].[LastName], [t0].[FirstName], [t0].[WebSite], [t0].[TimeStamp]
FROM [dbo].[Author] AS [t0]

Notice here that our select clause to the database is not optimized to return only the fields we requested. Instead, all of the fields are returned. So what's going on? In evaluating the Select clause, the provider discovered that unmapped properties were present. At that point, it simply populated a full Author object. Using this object, the provider then generates the anonymous type from the object rather than directly from the underlying data store. It's a bit of smoke and mirrors at this point.

So the question that came up asks if the next printing of the book needs to be adjusted to remove the statement that you can't project an unmapped property. While you can indeed project these properties, you can't use them elsewhere in the query. Thus, if you wanted to sort the data based on the unmapped property, the exception would still be thrown. Consider the following query.

            Dim authors = From a In context.Authors _
                          Order By a.FormattedName _
                          Select  a.FormattedName, a.WebSite

In this case when we try to run it, we get the following error:

"System.NotSupportedException: The member 'LinqInAction.LinqBooks.Common.VB.SampleClasses.Ch8.Author.FormattedName' has no supported translation to SQL."

Because of this, I plan to leave the note in the chapter warning you of using the unmapped property in your query. Unfortunately, I don't have enough space in the book to insert this complete explanation at this time. I hope this explanation helps some of you who are confused at this point.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Joining composite keys with LINQ

LINQ makes working with data in its various guises easier. By integrating it into the language, we have rich integrated support for working with data. However, there are times when the syntax is slightly different from what you would typically expect in TSQL. One case where this occurs is when trying to join two data sources that are related by more than one field (also known as a composite key). This differs from standard joins where one table has a primary key and the other table has a foreign key ID. Here's a sample table structure for a standard join in Northwind between the Products and Categories tables:

With Linq, this join could be represented with the following query:

Dim query = _
     From p In dc.Products _
     Join c In dc.Categories _
     On p.CategoryID Equals c.CategoryID _
     Select p.ProductName, c.CategoryName

So far, there's not much new in this query. Here, "p" is the outer variable and "c" is the inner variable of the join. Likewise, p.CategoryID is the outerKeySelector and c.CategoryID is the innerKeySelector in the Join extension method.
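For reference, the same join can be written against the underlying Join extension method, which makes those selector roles explicit. This is a sketch assuming the same dc context as above:

```vb
' The first lambda is the outerKeySelector, the second is the
' innerKeySelector, and the third is the result projection.
Dim query2 = dc.Products.Join(dc.Categories, _
    Function(p) p.CategoryID, _
    Function(c) c.CategoryID, _
    Function(p, c) New With {p.ProductName, c.CategoryName})
```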

This works fine when we have single values that can be compared easily. However, how can we specify multiple fields for the KeySelectors on the join? In my work with reverse mortgages, we have a situation where the loan amount is based in part on the lending limits set forth by HUD. In their limits, they specify the State and County for each limit. In this case, I need to join those values against the loan property's state and county to come up with the limit amount. Let's consider the following partial table schemas.

In this case, we could join the tables in TSQL with the following query:

SELECT LendingLimits.Amount
FROM LendingLimits
INNER JOIN Property
ON LendingLimits.State=Property.State AND
      LendingLimits.County=Property.County
WHERE PropertyId=@SearchValue  AND
      EffectiveDate = @TargetDate

Unfortunately, the Join extension method does not support the ability to provide the InnerKeySelector/OuterKeySelector as a series of expressions. However, when dealing with objects, we can compare objects to see if they equal each other. Therefore, the solution in this case is to join two anonymous types and compare them against each other. Here's the corresponding LINQ query. Notice the difference in the On clause. If you understand working with objects, this syntax should make sense.

Dim query1 = _
    From l In dc.LendingLimits _
    Join p In dc.Properties _
    On New With {l.County, l.State} Equals _
          New With {p.County, p.State} _
    Where p.PropertyId = SearchValue And _
          l.EffectiveDate = TargetDate _
    Select l.Amount

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Querying the complete plays of Shakespeare using LINQ to XML

I was working to come up with some creative uses of LINQ to XML for my new talk I'm giving at the Huntsville, AL Code Camp. I figured it would be good to include a sample which queries a large XML document. Remembering that the complete works of Shakespeare were available in XML form, I did a quick search and found a version at http://metalab.unc.edu/bosak/xml/eg/shaks200.zip. This file separates each play out into separate XML files. Since I wanted to find out which parts had the most lines across all plays, I wanted to put them into a single XML file. Rather than doing this manually, I went ahead and whipped up a quick LINQ query to fetch the xml documents and load them up into an array of XElements:

Dim plays = _
    From file In New System.IO.DirectoryInfo("C:\projects\ShakespeareXml").GetFiles() _
    Where file.Extension.Equals(".xml", StringComparison.CurrentCultureIgnoreCase) _
    Let doc = XElement.Load(file.FullName) _
    Select doc

OK, with that out of the way, I really wanted to load these resulting nodes into a single XML document. That's pretty easy using XML literals: just wrap the query with a new root element.

Dim plays = _
    <Plays>
        <%= From file In New System.IO.DirectoryInfo("C:\projects\ShakespeareXml").GetFiles() _
            Where file.Extension.Equals(".xml", StringComparison.CurrentCultureIgnoreCase) _
            Let doc = XElement.Load(file.FullName) _
            Select doc %>
    </Plays>

Easy. Now I have a new XML document containing the complete plays of Shakespeare. What can we do with it? Well, we can get a count of the plays in one line:

Console.WriteLine("Plays found: " & plays.<PLAY>.Count.ToString)

We could have done that without putting it into a new document. We do see that we have 37 plays represented, so we know the first query worked. Now we want to count the number of lines (LINE) for each character (SPEAKER). The XML document groups each set of lines into a parent node called SPEECH. This SPEECH node then contains the SPEAKER element and a series of LINE elements. For example, here's the beginning of Juliet's famous "Romeo, Romeo" speech:

<SPEECH>
    <SPEAKER>JULIET</SPEAKER>
    <LINE>O Romeo, Romeo! wherefore art thou Romeo?</LINE>
    <LINE>Deny thy father and refuse thy name;</LINE>
    <LINE>Or, if thou wilt not, be but sworn my love,</LINE>
    <LINE>And I'll no longer be a Capulet.</LINE>
</SPEECH>

So to achieve the goal of counting our lines by character, we find the descendant SPEECH nodes of the plays element (plays...<SPEECH>) and group them by the speaker. Then we project out the name of the speaker and the number of lines they have. We don't care about the bit roles, so we'll order the results in descending order based on the number of lines (LineCount) and limit the results to the top 50 entries. Here's the resulting query:

Dim mostLines = _
    From speech In plays...<SPEECH> _
    Group By key = speech.<SPEAKER>.Value Into Group _
    Select Speaker = key, _
           LineCount = Group.<LINE>.Count _
    Order By LineCount Descending _
    Take 50

The amazing thing with this process: running all three queries here, including the one which loads the full XML from the various files, takes less than a second. I haven't had time to do a full performance test, including memory load, but the initial results are quite impressive!

If you have other creative uses of LINQ to XML, let me know, I'd love to include them in future presentations. Also, if you're in the Huntsville, AL area on 2/23/2008, head on over to the code camp and see the entire presentation in person.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - Linq to XML -

Managing self referencing tables with LINQ to SQL

Using a single self-referencing table is a common database pattern for trees of data. As an example of this concept, we can use the Employee table in Northwind. It has a self-referential relationship set up using the ReportsTo field. If we drag this into the LINQ to SQL designer, it will infer the self-relation and create the appropriate EntitySet/EntityRef relationship, exposing it through the Employees property (for the subordinates of a given employee) and the Employee property (for the reference to the boss). The ReportsTo property contains the ID of the related employee.

Inspecting the attributes used for Employees, we see that the association is mapped as follows:

<Association(Name:="Employee_Employee", Storage:="_Employees", OtherKey:="ReportsTo")> _
Public Property Employees() As EntitySet(Of Employee)

Similarly, Employee is mapped as follows:

<Association(Name:="Employee_Employee", Storage:="_Employee", ThisKey:="ReportsTo", IsForeignKey:=true)> _
Public Property Employee() As Employee

If we wish to traverse the subordinates of a given employee (boss), we can use the following:

Private Sub Main()
   Dim dc As New NWindDataContext
   Dim StartingEmp = (From emp In dc.Employees Where emp.ReportsTo Is Nothing).FirstOrDefault
   Console.WriteLine("{0}{1} {2}", Space(indentLevel * 5), StartingEmp.FirstName, StartingEmp.LastName)
   DisplaySubordinates(StartingEmp)
End Sub

Private indentLevel As Integer
Private Sub DisplaySubordinates(ByVal emp As Employee)
   If emp.Employees.Count > 0 Then
      indentLevel += 1
      For Each subordinate In emp.Employees
         Console.WriteLine("{0}{1} {2}", Space(indentLevel * 5), subordinate.FirstName, subordinate.LastName)
         DisplaySubordinates(subordinate)
      Next
      indentLevel -= 1
   End If
End Sub

Likewise, we can walk the supervisors for any given employee using the following:

Private Sub DisplayReportsTo(ByVal emp As Employee)
   If Not emp.Employee Is Nothing Then
     Console.Write("Reports to: {0} {1} ", emp.Employee.FirstName, emp.Employee.LastName)
     DisplayReportsTo(emp.Employee)
   End If
End Sub

Realize that using this implementation does have a cost. Each time we iterate over Employee or Employees, we lazy load the resulting set. This can cause a rather chatty interaction with the data source. Because we are working with n levels of hierarchies, we cannot optimize this interaction using the load options. If we try to add a LoadOptions, we get the following run-time exception: "Cycles not allowed in LoadOptions LoadWith type graph."
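To see this restriction for yourself, here is a minimal sketch of a LoadWith configuration that triggers that exception, assuming the same Northwind context and Employee entity shown above:

```vb
' DataLoadOptions lives in System.Data.Linq.
Dim dc As New NWindDataContext
Dim options As New DataLoadOptions()
' Eagerly loading a self-referencing association creates a cycle
' in the LoadWith type graph, so this call throws at run time:
options.LoadWith(Of Employee)(Function(e) e.Employees)
dc.LoadOptions = options
```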

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

LINQ Migration hints

So, you're thinqing about converting existing code to use LINQ? Here are a couple of quick tricks to get you started:

1) Have your regression tests ready. No warranty is implied as to your success once completing the following.

2) Find any imports to System.Data.SqlClient, including global imports, and remove them. See what breaks in your application and rewrite it using LINQ to SQL instead of ADO.

3) Search for iteration loops (For Each/foreach) or explicit calls to enumerator.MoveNext and see if you can replace them with declarative queries.

As an example of this last point, consider this bit of code from the Personal Web Starter Kit (in the Admin/Photo.aspx.vb file):

Dim d As IO.DirectoryInfo = New IO.DirectoryInfo(Server.MapPath("~/Upload"))
Dim enumerator As System.Collections.IEnumerator = CType(d.GetFiles("*.jpg"), System.Collections.IEnumerable).GetEnumerator
Do While enumerator.MoveNext
  Dim f As IO.FileInfo = CType(enumerator.Current, IO.FileInfo)
  Dim buffer() As Byte = New Byte((f.OpenRead.Length) - 1) {}
  f.OpenRead.Read(buffer, 0, CType(f.OpenRead.Length, Integer))
  PhotoManager.AddPhoto([Convert].ToInt32(Request.QueryString("AlbumID")), f.Name, buffer)
Loop
GridView1.DataBind()

If your source offers the ability to pass an array of items rather than requiring an explicit iteration, you may be able to simplify the process. In this case, I also chose to refactor the logic inside the loop into a separate method to further improve maintainability. Here's the refactored code:

Sub Button1_Click(ByVal sender As Object, ByVal e As ImageClickEventArgs)
  PhotoManager.AddPhotoList( _
     From f In New IO.DirectoryInfo(Server.MapPath("~/Upload")).GetFiles("*.jpg") _
     Select PhotoManager.CreatePhoto(CInt(Request.QueryString("AlbumID")), _
                f.Name, FileBytes(f)))
  GridView1.DataBind()
End Sub

Private Function FileBytes(ByVal source As IO.FileInfo) As Byte()
  ' Open the file once and dispose of the stream when done, rather
  ' than creating a new stream with each call to OpenRead.
  Using stream = source.OpenRead()
    Dim buffer(CInt(stream.Length) - 1) As Byte
    stream.Read(buffer, 0, buffer.Length)
    Return buffer
  End Using
End Function

Is the resulting code better? In this case, it is a toss-up. I'm just trying to show the possibility. It's up to you to test the applicability in your own applications. If you want more migration hints, check out my presentation at http://www.thinqlinq.com/Downloads/LINQMigrationStrategies.zip.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Personal Web Starter Kit LINQed up

In case anyone is interested, I have put together a sample port of the original Personal Web Starter Kit using LINQ rather than the standard ADO data tier in the PhotoManager.vb class. With this version, we can eliminate all of the stored procedures and rely on LINQ for our entire data access. In this implementation, I intentionally attempted to retain the original method signatures where possible to make migration more seamless. The project site is at http://code.msdn.microsoft.com/LinqPersonalWeb.

This is one of the sample projects I use in my LINQ Migration strategies talks, so if you attended that talk, check out the sample project for some more concrete examples. Because the original example is fairly basic and the tiers are separated out neatly, doing a migration really only requires replacing code in one file: PhotoManager.vb. Let's take a look at a couple of the refactorings we did for this example.

The meat of the original project is a solution to group images into albums and store them in a database. To start, we create a new mapping file by creating a new LINQ to SQL Classes file. Onto this surface, drag the Album and Photo tables to generate the entity classes and associated mappings. With that in place, we can move our attention to the PhotoManager class which abstracts all of the data access.

The PhotoManager class has separate methods to GetPhoto, GetPhotos, AddPhoto, EditPhoto and RemovePhoto. The same is true for albums. Each of these maps to corresponding stored procedures. Let's compare the original implementation of GetPhoto with the LINQ enabled version. In the original, we see familiar code to create a datareader and Fetch a scalar result from the function.

Public Overloads Shared Function GetPhoto(ByVal photoid As Integer, ByVal size As PhotoSize) As Stream
  Using connection As New SqlConnection(ConfigurationManager.ConnectionStrings("Personal").ConnectionString)
    Using command As New SqlCommand("GetPhoto", connection)
      command.CommandType = CommandType.StoredProcedure
      command.Parameters.Add(New SqlParameter("@PhotoID", photoid))
      command.Parameters.Add(New SqlParameter("@Size", CType(size, Integer)))
      Dim Filter As Boolean = Not (HttpContext.Current.User.IsInRole("Friends") Or HttpContext.Current.User.IsInRole("Administrators"))
      command.Parameters.Add(New SqlParameter("@IsPublic", Filter))
      connection.Open()
      Dim result As Object = command.ExecuteScalar
      Try
        Return New MemoryStream(CType(result, Byte()))
      Catch
        Return Nothing
      End Try
    End Using
  End Using
End Function

The corresponding Stored Procedure is as follows:

CREATE PROCEDURE GetPhoto
  @PhotoID int,
  @Size int,
  @IsPublic bit
AS
  IF @Size = 1
    SELECT TOP 1 [BytesThumb] FROM [Photos] LEFT JOIN [Albums] ON [Albums].[AlbumID] = [Photos].[AlbumID] WHERE [PhotoID] = @PhotoID AND ([Albums].[IsPublic] = @IsPublic OR [Albums].[IsPublic] = 1)
  ELSE IF @Size = 2
    SELECT TOP 1 [BytesPoster] FROM [Photos] LEFT JOIN [Albums] ON [Albums].[AlbumID] = [Photos].[AlbumID] WHERE [PhotoID] = @PhotoID AND ([Albums].[IsPublic] = @IsPublic OR [Albums].[IsPublic] = 1)
  ELSE IF @Size = 3
    SELECT TOP 1 [BytesFull] FROM [Photos] LEFT JOIN [Albums] ON [Albums].[AlbumID] = [Photos].[AlbumID] WHERE [PhotoID] = @PhotoID AND ([Albums].[IsPublic] = @IsPublic OR [Albums].[IsPublic] = 1)
  ELSE IF @Size = 4
    SELECT TOP 1 [BytesOriginal] FROM [Photos] LEFT JOIN [Albums] ON [Albums].[AlbumID] = [Photos].[AlbumID] WHERE [PhotoID] = @PhotoID AND ([Albums].[IsPublic] = @IsPublic OR [Albums].[IsPublic] = 1)
  ELSE
    SELECT TOP 1 [BytesPoster] FROM [Photos] LEFT JOIN [Albums] ON [Albums].[AlbumID] = [Photos].[AlbumID] WHERE [PhotoID] = @PhotoID AND ([Albums].[IsPublic] = @IsPublic OR [Albums].[IsPublic] = 1)

RETURN

For those unfamiliar with the sample solution, realize that the Photos table contains multiple copies of each image rendered in different resolutions. The procedure is responsible for determining which field to return based on the size parameter that is passed in. By limiting the fields returned, we can optimize the IO requirements, but this does mean that any time we want to access the photos, we need to repeat the same logic. Indeed, the same IF block used in the GetPhoto procedure is copied and reused in the GetFirstPhoto procedure. This does not lead to the kind of code maintainability we would like to see.

Naturally, since you're reading about it here, I'm sure you would like to see how LINQ may offer a better alternative. Starting with the GetPhoto method, we can eliminate the late-bound ADO code and provide a more strongly typed version of the same method. We will also be able to refactor and reuse more pieces of the query throughout the application. Here is the code for our GetPhoto method.

Public Overloads Shared Function GetPhoto(ByVal photoid As Integer, ByVal size As PhotoSize) As Stream
  Dim dc As New PersonalDataContext
  Dim query = From p In dc.Photos _
                      Where (p.PhotoID = photoid) And _
                                 (p.Album.IsPublic Or IsFriend)
  Return GetPhotoBytes(query, size)
End Function

GetPhoto greatly reduces the amount of plumbing code and allows us to focus on the desired results. It returns the actual image that corresponds to the requested ID and size. It also checks whether the user is allowed to see that photo by testing whether the album is marked as public, or whether the user is considered a friend based on their login credentials. Here's the implementation of the IsFriend method.

Public Shared Function IsFriend() As Boolean
  Return (HttpContext.Current.User.IsInRole("Friends") Or _
          HttpContext.Current.User.IsInRole("Administrators"))
End Function

Since the user's credentials are already cached for the current user, there is no need to requery that part of the database on every fetch.  The real key to this implementation lies in the GetPhotoBytes method. In this method, we evaluate the size parameter and dynamically extend our query to project just the field we want to consume.

Private Shared Function GetPhotoBytes(ByVal source As IQueryable(Of Photo), ByVal size As PhotoSize) As Stream
  Dim imageBytes As Byte()
  Select Case size
    Case PhotoSize.Large
      imageBytes = source.Select(Function(p) p.BytesFull).SingleOrDefault
    Case PhotoSize.Original
      imageBytes = source.Select(Function(p) p.BytesOriginal).SingleOrDefault
    Case PhotoSize.Small
      imageBytes = source.Select(Function(p) p.BytesThumb).SingleOrDefault
    Case Else
      imageBytes = source.Select(Function(p) p.BytesPoster).SingleOrDefault
  End Select
  If imageBytes IsNot Nothing Then
    Return New MemoryStream(imageBytes)
  Else
    Return New MemoryStream()
  End If
End Function

Here we extend the initial query and add custom projection to it. When we issue the query to the database, the resulting SQL statement wraps the functionality we declared in the GetPhoto with the GetPhotoBytes to create a single statement which only returns the image stream that we requested. What's better is that we can now reuse this same GetPhotoBytes method in the GetFirstPhoto implementation, passing a different baseline query.

Public Shared Function GetFirstPhoto(ByVal albumid As Integer, ByVal size As PhotoSize) As Stream
  Dim dc As New PersonalDataContext
  Dim query = From p In dc.Photos _
                      Where p.AlbumID = albumid And (p.Album.IsPublic Or IsFriend()) _
                      Take 1
  Return GetPhotoBytes(query, size)
End Function

There. Nicely refactored and no more copy-paste inheritance in the database.

If you're interested in looking at this implementation further, check out the project site at http://code.msdn.microsoft.com/LinqPersonalWeb. Also, let me know if you would like to see other starter kits migrated to LINQ and I'll see what I can do.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB -

Adding categories to the RSS feed using LINQ to XML

Last time, we added categories to the web view of the ThinqLinq site. This time, we're going to add them to the RSS feed. Because RSS is "Really Simple", adding the categories is fairly easy. According to the RSS specification, <category> is an optional sub-element of <item>. It can additionally contain the domain that contains that category. In our case, we will point the domain to our implementation that displays all posts for a given category by passing the category id to the query string of our default page. Thus we would want to insert a category like follows:

<category domain="http://www.ThinqLinq.com/default.aspx?CategoryId=1">LINQ</category>

To refresh your memory, we last discussed creating RSS in the following post: http://www.thinqlinq.com/Default/Projecting_XML_from_LINQ_to_SQL.aspx. To add our categories, we need to inject an additional query inside each <item> element. To generate the categories for a post, we add a query to project (using the select statement) our new <category> nodes as follows:

     <%= From c In post.CategoryPosts _
         Select <category domain=<%= "Default.aspx?CategoryId=" & c.CategoryID %>>
             <%= c.Category.Title %></category> %>

Notice here that when we create the domain, we can use the CategoryID of the CategoryPosts object from the many-to-many table. To display the category's title, we need to drill through the CategoryPost to the child Category object's Title property (c.Category.Title). Putting this all together, using the magic of LINQ to XML and VB, we arrive at the following statement:

Response.Write(<?xml version="1.0" encoding="UTF-8"?>
  <rss version='2.0' xmlns:dc='http://purl.org/dc/elements/1.1/' xmlns:slash='http://purl.org/rss/1.0/modules/slash/' xmlns:wfw='http://wellformedweb.org/CommentAPI/'>
    <channel>
      <title>Thinq Linq</title>
      <link>http://www.ThinqLinq.com/default.aspx</link>
      <description>Thoughts relating to LINQ and Language Integrated Query related topics.</description>
      <dc:language>en-US</dc:language>
      <generator>LINQ</generator>
      <%= From post In query.ToArray _
          Select <item>
                   <title><%= post.Title %></title>
                   <link><%= "http://www.thinqlinq.com/Default.aspx?Postid=" & post.Id.ToString %></link>
                   <pubDate><%= post.PublicationDate.ToString("ddd, dd MMM yyyy hh:mm:ss GMT") %></pubDate>
                   <guid isPermaLink="false">42f563c8-34ea-4d01-bfe1-2047c2222a74:<%= post.Id %></guid>
                   <description><%= post.Description %></description>
                   <%= From c In post.CategoryPosts _
                       Select <category domain=<%= "Default.aspx?CategoryId=" & c.CategoryID %>>
                           <%= c.Category.Title %></category> %>
                 </item> %>
    </channel>
  </rss>)

If you want to view the resulting feed, browse directly to http://www.thinqlinq.com/rss.aspx rather than using the standard RSS feed link which is managed through FeedBurner to allow me to gather additional statistics for this site.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - Linq to XML -

Adding a RSS feed for file downloads

Ok, so I've been a bit busy this weekend adding some nice stuff for this site. One thing that I wanted to add was another RSS feed, this time for the file upload section. If you want to subscribe to the File RSS feed, direct your aggregator to the following link: http://www.thinqlinq.com/Files.aspx/Rss

Of course, since this is a learning site, I'll let you in on the code needed to accomplish the task. As you may guess, LINQ makes serving up XML from an object collection using a heterogeneous join to a database fairly easy. If you haven't seen how we display the files for this site, check out the podcast I did for Wally last year where we go over it in more detail.

To begin the task, we will set up our initial query. We will use two data sources for this task. The first data source is the object collection containing the FileInfo objects from the download directory for this site (System.IO.DirectoryInfo(...).GetFiles). The second source is the FileClassic table returned using LINQ to SQL. LINQ allows us to easily join these two data sources and work with the result as we would any other data source.

Dim fileDescriptions = dc.GetTable(Of FileClassic)()
Dim files = (From f In New System.IO.DirectoryInfo(Server.MapPath("~/Downloads")).GetFiles _
                  Join desc In fileDescriptions On f.Name Equals desc.FileName _
                  Order By f.LastWriteTime Descending _
                  Select URL = "Downloads/" & f.Name, _
                  Name = System.IO.Path.GetFileNameWithoutExtension(f.Name), _
                  f.LastWriteTime, f.Length, _
                  Description = Server.HtmlDecode(desc.Description), desc.Id).ToArray

With our query prepared, we can now use LINQ to XML with VB Literals to generate the RSS feed and output it to the response stream as follows:

Response.Write(<?xml version="1.0" encoding="UTF-8"?>
  <rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:slash="http://purl.org/rss/1.0/modules/slash/" xmlns:wfw="http://wellformedweb.org/CommentAPI/">
    <channel>
      <title>ThinqLinq samples and presentations</title>
      <link>http://www.ThinqLinq.com/Files.aspx</link>
      <description>Here are some of my presentation files available for you to use as you wish.  Just let me know if you like what you see and what you want to see more of.</description>
      <%= From f In files _
          Select <item>
                   <title><%= f.Name %></title>
                   <link><%= "http://www.ThinqLinq.com/" & f.URL %></link>
                   <pubDate><%= f.LastWriteTime.ToString("ddd, dd MMM yyyy hh:mm:ss GMT") %></pubDate>
                   <guid isPermaLink="false"><%= "42f563c8-34ea-4d01-bfe1-2047c2222a74:" & f.Id %></guid>
                   <description><%= f.Description %></description>
                   <enclosure url=<%= "http://www.ThinqLinq.com/" & f.URL %>
                              length=<%= f.Length %>
                              type="application/x-zip-compressed" />
                 </item> %>
    </channel>
  </rss>)

The main difference between this example and the ones we are using for blog posts is the addition of the <enclosure> node. With enclosure, we specify the url to the file download, the file length, and the file type. In this site, I plan to always use ZIP files so I just hard code that. The other values come from our starting query. If you subscribe to this new feed, you should see each item including an enclosure icon so that you can download it.

Posted on - Comment
Categories: VB Dev Center - LINQ - VB - Linq to XML -

Filtering items in the ThinqLinq aggregation feed

When I first released ThinqLinq, the only filtering I applied was selecting the top 20 posts. I was recently asked if I could extend the implementation so that the aggregation feed could be filtered based on categories. Since ThinqLinq uses LINQ for the data interaction, it is relatively easy to add filtering to the existing query.

However, in this case, the filtering is not a simple Where clause on the underlying table. That is because the table structure uses a Many to Many relationship between the Categories and PostItems. Here's the object relationships as created by the LINQ to SQL designer:

Typically when querying a database in a many to many relationship, we start on one end and work our way to the other. LINQ offers another option. Consider the following LINQ Query:

From catPost In dc.CategoryPosts _
Where catPost.Category.Title = Request.QueryString("Category") _
Select catPost.PostItem

In this query, we actually start in the middle. We can do it because we can use object trees rather than having to rely on joins. With LINQ, the many-to-many table contains a reference to the related objects on both sides. This way, we can use one side (the Category) in the Where clause and the other side (PostItem) for the Select clause. Naturally, we could also mix and match the results.

For the RSS feed, we need to return the full results if no query string is supplied, but filter them if there is one. In addition, we will replace the spaces with underscores so that we can include them in the query string. With VB's Replace method, we can easily convert back from the underscores to the spaces. Here's the first query that the example will use:

Dim query As IQueryable(Of PostItem)
If Request.QueryString("Category") <> "" Then
    query = From catPost In dc.CategoryPosts _
            Where catPost.Category.Title = Replace(Request.QueryString("Category"), "_", " ") _
            Select catPost.PostItem
Else
    query = dc.PostItems
End If

We declare the query as IQueryable(Of PostItem) so that we can further refine the query results later. Regardless of whether we use the filtered or unfiltered version, we still want to limit the results to the most recent 20 items. The composability of IQueryable allows us to further refine our query as follows:

query = From post In query _
        Order By post.PublicationDate Descending _
        Take 20

Notice that we don't include the Select clause as it is optional in VB. Now that we have our target results, we can generate our XML as we did before in this post.
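The XML generation step mentioned above can also be written with XML literals. This is only a sketch: the rss element layout follows the standard RSS 2.0 shape, and the PostItem property names other than PublicationDate (Title, Description) are assumptions for illustration.

```
' Sketch of building the feed from the composed query with XML literals.
' Title and Description are hypothetical property names on PostItem.
Dim rss = <rss version="2.0">
              <channel>
                  <title>ThinqLinq</title>
                  <%= From post In query.AsEnumerable _
                      Select <item>
                                 <title><%= post.Title %></title>
                                 <pubDate><%= post.PublicationDate.ToString("r") %></pubDate>
                                 <description><%= post.Description %></description>
                             </item> %>
              </channel>
          </rss>
Response.ContentType = "text/xml"
rss.Save(Response.Output)
```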

Posted on - Comment
Categories: VB Dev Center - VB - LINQ -

Using LINQ to SQL to return Multiple Results

In the LINQ in Action forum, a user asked about returning multiple result sets from a single stored procedure. Below is one way of dealing with this issue.

In the procedure, we used multiple results rather than a result with a return value (through RETURN or an OUTPUT parameter). Here we need to use IMultipleResults rather than the default ISingleResult implementation. It appears that the designer does not map IMultipleResults in the final build, so we are going to need to do it ourselves. We mention this interface in chapter 8 but didn't have a chance to include a sample. Here's a sample implementation of returning the Subjects and Books from the Book sample database. First the stored proc:

CREATE PROCEDURE dbo.GetSubjectAndBooks
AS

 Select * from subject
 
 IF @@RowCount>0 BEGIN
  Select * from Book
 END

Now for the function mapping. We want to create a function that can return both the Subjects and the Books. To do this, we will create a function that returns the multiple results. Similar to the standard stored procedure mapping, you create a function in a custom partial for the DataContext. The function will return a value of type IMultipleResults. Decorate the function with the FunctionAttribute including the name of the stored procedure. Here's the implementation in VB:

    <FunctionAttribute(Name:="dbo.GetSubjectAndBooks")> _
    <ResultType(GetType(Book))> _
    <ResultType(GetType(Subject))> _
    Public Function GetSubjectAndBooks() As IMultipleResults
        Dim result As IExecuteResult = Me.ExecuteMethodCall(Me, CType(MethodInfo.GetCurrentMethod, MethodInfo))
        Dim results As IMultipleResults = DirectCast(result.ReturnValue, IMultipleResults)
        Return results
    End Function

Notice, the main difference here is the addition of two attributes identifying the possible ResultTypes (Book and Subject). The rest of the function should be self-explanatory. To consume the function, we call the GetResult method of IMultipleResults passing in the generic type we want to return as follows:

    Dim context As New LinqBooksDataContext
    Dim results As IMultipleResults = context.GetSubjectAndBooks()

    ObjectDumper.Write(results.GetResult(Of Subject))
    ObjectDumper.Write(results.GetResult(Of Book))

Categories: VB - LINQ - VB Dev Center -

Creating HTML emails using VB 9 and LINQ

Today, I'm not looking at sending mass spam using LINQ to pull a list of recipients. I'm actually referring to the ability to generate the message body using XML Literals. Using the System.Net.Mail.MailMessage object, we can easily send emails to an SMTP server.

The body of the email can be either plain text or HTML. Dynamically generating the text is often a laborious task involving a string builder and lots of method calls. The body of the email corresponds to the body portion of an HTML page. If you use well-formed XHTML in the body, you are actually generating a specialized version of XML. Once we are working with XML, we can use XML Literals in VB to format our output.

I recently had to do this on a project to send lists of updated values from an external source. In the body, I needed to dynamically fill an HTML table with the new values. The table consists of 4 columns: State, County, Limit, Effective Date. I begin by laying out the content in an HTML editor (like Visual Studio 2008...). Here are the results:

<body>Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Curabitur eros purus, suscipit ac, pulvinar vel, aliquet vehicula, pede. Duis eros dolor, iaculis non, aliquam sed, tincidunt ac, diam.
    <table>
        <tr>
            <th>State</th>
            <th>County</th>
            <th>New Limit</th>
            <th>Effective Date</th>
        </tr>
        <tr>
            <td>XX</td>
            <td>Foo</td>
            <td>$123</td>
            <td>1/1/2000</td>
        </tr>
    </table>
</body>

I know what you must be thinking by now: Gee Jim, how could you come up with such a beautiful page? As Bones would say, "D@mmit Jim, I'm a programmer, not a designer." We'll keep it clean for now to focus on what is yet to come.

Realize that our body tag is actually the root of a well-formed XML document. As such, we can copy it as a template directly into our tool (which is a console application, by the way), add references to System.Linq and System.Xml.Linq, and paste it into our VB module, assigning a variable, let's call it "body", to the XML.

While we're at it, we'll go ahead and insert new rows into the table based on the results of an object query. In this query, we'll iterate over the records we are adding, which is an IEnumerable(Of FhaLimit). We'll project a new row (<tr>) for each object in our iteration. Rather than imperatively iterating, we'll use LINQ's declarative syntax. In addition, we'll insert our values using the <%= %> placeholders. Here's the resulting declaration:

Friend Shared Sub SendUpdate(ByVal newItems As IEnumerable(Of FhaLimit))
    Dim body = _
        <body>Lorem ipsum dolor sit amet, consectetuer adipiscing elit. Curabitur eros purus, suscipit ac, pulvinar vel, aliquet vehicula, pede. Duis eros dolor, iaculis non, aliquam sed, tincidunt ac, diam.
            <table>
                <tr>
                    <th>State</th>
                    <th>County</th>
                    <th>New Limit</th>
                    <th>Effective Date</th>
                </tr>
                <%= From limit In newItems _
                    Order By limit.State, limit.CountyName _
                    Select <tr>
                               <td><%= limit.State %></td>
                               <td><%= limit.CountyName %></td>
                               <td><%= limit.Units1.Value.ToString("c0") %></td>
                               <td><%= limit.LimitTransactionDate.ToShortDateString %></td>
                           </tr> %>
            </table>
        </body>

If you've coded ASP.NET, the resulting declaration should look very familiar. Realize, though, that this is being done in a VB module in a console application; we are not coding in an .ASPX file! The resulting code is much more maintainable than the old approach using string builders or XSLT.

To finish off the task, we are going to send the message with our new XHTML body. This is very easy with .NET as well.

Dim message As New System.Net.Mail.MailMessage("from@ThinqLinq.com", "to@ThinqLinq.com", "Limits Updated", body.ToString)
message.IsBodyHtml = True
Dim server As New System.Net.Mail.SmtpClient(My.Settings.SmtpServer)
server.Send(message)

There you go, a quick and painless way to create HTML emails using VB 9 and LINQ. Let me know what you Thinq.

Categories: LINQ - VB - VB Dev Center - Linq to XML -

Code Snippets and Partial Methods

In my VB 9 language enhancements talks, I do almost all of the coding on the fly, as I find people can often comprehend the code better that way. I start by building a quick class that is used throughout the demos. To assist, I use the snippet functionality in VB. For example, if you type "property" and then tab twice, the designer will generate a private field with public property accessors. Then, if you change the highlighted values, any associated names will be changed as well.

Private newPropertyValue As String
Public Property NewProperty() As String
    Get
        Return newPropertyValue
    End Get
    Set(ByVal value As String)
        newPropertyValue = value
    End Set
End Property

Personally, if you are not doing anything within your properties, there isn't much to be gained by using properties as compared to just exposing the field publicly. I want a bit more functionality built into my properties. At the very least, I want to be able to include some change tracking. One nice feature of snippets is that they are quite easy to modify, and you can create your own.

To begin, we need to find where the supplied snippets are located on disk. We can find this by clicking "Tools" and then "Code Snippets Manager". Drill into "Code Patterns", then "Properties, Procedures, Events", and find the "Define a Property" snippet. The location window will show you where this one is located. In the default install, it will be in your C:\Program Files\Microsoft Visual Studio 9.0\Vb\Snippets\1033\common code patterns\properties and procedures\ folder. Navigate to this folder and copy DefineAProperty.snippet. Paste it as a new file and name it whatever you want, keeping the .snippet extension.

The snippet file is just an XML document. Open it with Visual Studio to edit it. The top is a Header node which includes the description information that will show up in the snippet manager. One change you will need to make is to alter the "Shortcut" tag so that it uses the key combination you want to invoke your custom snippet. In my demos, I use "propertycc", thus I change the header as follows:

<Title>Define a Property with PropertyChanging</Title>
<Author>Jim Wooley</Author>
<Description>Defines a Property with a backing field.</Description>
<Shortcut>Propertycc</Shortcut>

The key is to change the info in the Snippet node. In my case, I like to use a convention where the private field has the same name as the public property, except that it is prefixed with an underscore. Thus, my field may be called _Foo and the property is called Foo. Because of this, I can eliminate the PrivateVariable node and just keep the PropertyName and PropertyType nodes.

With these changes in place, we can actually define our new generated code. This can be found in the CDATA section in the Code node. I use the following in my snippet declaration:

<![CDATA[Private _$PropertyName$ As $PropertyType$
Public Property $PropertyName$() As $PropertyType$
    Get
        Return _$PropertyName$
    End Get
    Set(ByVal value As $PropertyType$)
        If Not _$PropertyName$.Equals(value) Then
            _$PropertyName$ = value
            OnPropertyChanged("$PropertyName$")
        End If
    End Set
End Property
]]>

With this definition, any time we change the starting PropertyName, all associated values will be changed for each snippet. When we save our changed snippet and open a class library, we can start to use our new snippet by typing "propertycc", and the following code will be generated for us:

Private _NewProperty As String
Public Property NewProperty() As String
    Get
        Return _NewProperty
    End Get
    Set(ByVal value As String)
        If Not _NewProperty.Equals(value) Then
            _NewProperty = value
            OnPropertyChanged("NewProperty")
        End If
    End Set
End Property

Now our property detects when its value changes, and we can then do something with that information. In this case, we call an OnPropertyChanged(propertyName As String) method, assuming we have defined one in our class. If we don't have one defined, we won't be able to compile our application. We have several options for providing the OnPropertyChanged method. The class could inherit from a base class implementation, but that additional complexity may not be necessary in many cases. Alternatively, we could implement a concrete method in our class. This means a slight performance hit if we don't actually do anything in the method.

As an alternative, we can use the new partial methods in VS 2008. The great thing about partial methods is that if they are not implemented they are compiled away. Additionally, we can place the partial stub in a partial class for generated code and then put the implementing method in the other half of a partial class which is isolated to the custom business functionality. With this architecture in mind, we can define our partial method in the class with the rest of our generated code properties:

Partial Private Sub OnPropertyChanged(ByVal propertyName As String)

End Sub

In the other half of our partial class pair of files, we can implement the method as follows:

Private Sub OnPropertyChanged(ByVal propertyName As String)
    RaiseEvent PropertyChanged(Me, New System.ComponentModel.PropertyChangedEventArgs(propertyName))
End Sub

Public Event PropertyChanged(ByVal sender As Object, ByVal e As System.ComponentModel.PropertyChangedEventArgs) Implements System.ComponentModel.INotifyPropertyChanged.PropertyChanged

If we don't need the implementation, the OnPropertyChanged calls in the property setters will be compiled away; otherwise, the stubs generated by our snippet are already in place to handle the functionality as necessary.
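To see the pieces working together, here is a minimal sketch of a consumer. The Customer class and its Name property are hypothetical stand-ins for a class built with the snippet above, with the partial method implemented to raise PropertyChanged:

```
' Hypothetical consumer of a snippet-generated class whose implemented
' partial method raises PropertyChanged on each real value change.
Module Demo
    Sub Main()
        Dim cust As New Customer()
        AddHandler cust.PropertyChanged, AddressOf OnCustomerChanged
        cust.Name = "ThinqLinq"  ' setter detects the change and raises the event
    End Sub

    Private Sub OnCustomerChanged(ByVal sender As Object, _
                                  ByVal e As System.ComponentModel.PropertyChangedEventArgs)
        Console.WriteLine("Changed: " & e.PropertyName)
    End Sub
End Module
```

Setting the property to the value it already holds raises nothing, because the snippet's Equals check in the setter short-circuits the notification.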

If you are interested in trying out this snippet, I'm attaching it to this post. Simply unzip it to your snippets directory and try it out.

Categories: VB - VB Dev Center -