Sunday, September 10, 2017

MKE Dot Net 2017 Review

On Saturday, September 9, I attended the 3rd annual MKE DOT NET conference here in Milwaukee.  This is a one-day event organized by Centare, a local technology consultancy, and it focuses on Microsoft .NET and related technologies.  I was also fortunate enough to be selected as a speaker for the second year in a row.

I said this last year after attending, and I will say it again this year: this is one of the best organized and best executed events I have ever attended.  The 2016 edition of the event was excellent, and yet the MKE DOT NET team found a way to deliver an even better event this year.  I say that as both a speaker and as an attendee.  This is a truly first-class event, one that every developer in Wisconsin and the Chicago area should be looking to attend in future years.  You can travel farther and spend more money, but you will not find a better executed event than this one.

For the rest of this post, I'll review various aspects of my experiences in the hopes they are useful to others when looking to attend or speak at this event in future years.

Speaker Experience

The speaker experience at MKE DOT NET is, in a word, superior.

The communication between the organizers and speakers is excellent.  Speaker selections went out on June 22nd, well ahead of the event.  In addition, two weeks before the event they sent out a speaker guide, answering questions about the A/V capabilities of the rooms, the schedule for the day and other general information about the event that is good to know beforehand.  They also offered to pick up any out of town speakers at the airport if need be.  This level of communication really makes the speaker's job easier, because a lot of the questions you have about an event, they have already proactively answered.

Like last year, they hosted a speaker dinner on the Friday night before the event.  This year it was at Third Space Brewing in Milwaukee.  More speakers were able to attend this year, and the space was more conducive to socializing with your fellow speakers than last year's venue.  That is not to say last year wasn't good; it was very good.  But last year was more of just a dinner, whereas this year was a dinner and an opportunity to get to know everyone, so this was excellent.

As for the actual speaking part, this was also excellent.  The event was held at the Potawatomi Casino in Milwaukee, which also has a hotel and conference space.  The A/V in the rooms was excellent.  Projectors were built into each room, had a 16:9 aspect ratio, were bright and projected onto a large screen built into the room.  In short, the rooms were built for events like this, with high resolution projectors so everyone could easily see.  Having high quality A/V like this is a big win.  The rooms seated around 75-80 people, so there was plenty of space for popular sessions.  Finally, while I was setting up, two volunteers from the conference came in to check that I had everything I needed and that everything was working.  Again, a first-class experience on the rooms.

As for an opportunity to improve, this year talks were 45 minutes rather than the usual hour.  I would prefer that in future years they go back to the hour-long format for talks.  As a speaker, you often prepare your talks for an hour because that is what most events allocate.  So I needed to do some trimming on my talk to have it fit in the allotted time, and even then, I still felt a little rushed in the last section.

Attendee Experience

Aside from giving my session, I was able to attend a number of other sessions at the event as well.  Again, I think the attendee experience was superior.

The most important factor in the attendee experience comes down to who is speaking and whether there are interesting sessions to attend in each time slot.  What I like about this event is that there is a mix of local speakers and speakers from out of the area who have more recognizable names.  I think that is an important balance to keep in the future.  It is nice to have some recognizable names to draw people into the event, but I would hate for all of the area speakers to be crowded out of the event as well.  Being able to learn from and network with both is important, and I hope this continues.

I found Joel Karr's talk on not thinking when you write code very thought provoking.  He is right: people have different skill levels with different technologies, and we should consider that when assigning work on a team.  What this also says is that we want to gain enough practice with key technologies that we don't have to think when we work on a task -- and by that I mean we understand the technology and problem so well that we can rely on muscle memory.  Further, he talked about how we need to admit when we are in learning mode.  This is an excellent point.  There is nothing wrong with learning mode.  We all have to do it, and we all do it a lot.  But let's admit it to ourselves and know that progress will be slower and it will be hard to fully define the work while learning.  And as such, we need to make time for this learning.  These are all very good thoughts to keep in mind.

I then attended the talk "Components of Good UI: An Intro to ReactJS" by Vince Maiuri and Ryan Feil.  During this talk, they live coded a simple ReactJS application, which was good to see because it really helped me understand how the different pieces of a ReactJS app fit together.  I've looked at some tutorials for ReactJS before, but this talk was the clearest explanation I've seen yet about how to get going with ReactJS.  My team is working on an Angular 4 app right now, so I don't have any plans to start working with ReactJS right away, but it is good to understand the different approaches the two frameworks take.

The next session I attended was on Mocking and Unit Testing by John Wright.  John works at Stack Overflow and is clearly a really sharp guy.  I liked the history he gave of mocking libraries and how he introduced the different things you can do with a mocking library.  The biggest takeaway for me was the capabilities of some of the unconstrained mocking frameworks.  While these are all paid libraries, they will "rewrite code on the fly" during the JIT process and allow you to test things that you otherwise could not easily test, like legacy code.  Something worth keeping in mind.

The final session I attended was Dustin Ewer's session on d3.js.  I've seen Dustin speak a few times before, and he always does a really good job, and this time was no exception.  I played around with d3 a couple of years ago, so this was a good re-introduction to its capabilities for me.  He also talked about some of the libraries that have now been built on top of d3.js that make consuming the library a little easier.  This is something I wish I had time to play around with more, but it is a topic that has to go onto the learning backlog for now.

There were a number of other sessions I wish I could have attended.  I think Jeff Strauss's talk on Open Source software would have been excellent, as would the talks on Functional JavaScript by Jonathan Mills and Jane Prusakova's talk on the power of a Nudge.  I could go on and on; it just speaks to the depth of quality sessions, which is the most important attribute of any conference.

Finally, one of the other nice perks of the event is that they host a happy hour at the conclusion, where everyone gets a drink ticket, a custom commemorative glass and the opportunity to talk with their fellow attendees about the day's happenings.  This is a really nice touch, and a great chance to catch up with others rather than everyone just taking off after the last session ends.  This year, the commemorative glass was a really heavy mug that I am sure will be great for a big mug of Sprecher root beer or even a root beer float.  I just need to get it washed so I can try it out.

Other Thoughts

I really liked the Potawatomi as the location for the event.  As I covered above, the rooms were first class and really designed for events like this.  I thought the food at both breakfast and lunch was very good and there was plenty of space to host an event of this size.  I also like the location because if someone is looking for something to do after the event, there are lots of options.  Obviously there is the casino itself for those who like that sort of thing, but they also have restaurants and live music at the Potawatomi.  And if that isn't your cup of tea, you are a quick 10-12 minute drive from downtown Milwaukee and the Historic Third Ward with all of their restaurants and entertainment options.  So just a really good location.

This year's event had about 320 attendees (excluding speakers), which is up from last year and is very good.  But for as good as this event is, that number should be higher, more like 500 or so.  I think most of the attendees were Milwaukee based, but there is no reason developers from Chicago, Madison and Appleton/Green Bay shouldn't be able to make the quick drive over as well, especially given the quality of the event.  You are just not going to find a better speaker lineup or a better put-on event anywhere in the area.  Tickets ranged from $79 to $119, which is really cheap considering that included two meals, an after party and a t-shirt.  I am happy with the 320 number, but next year I would like to see what I can do to help get that number even higher.

One other thought is that the event started at 8:00 AM this year.  Unfortunately, this was also complicated by the Brewers half marathon blocking off most of the roads to the event, so a lot of people, including myself, arrived late.  For me, this meant missing the keynote presentation, which was disappointing.  However, I wonder if it might be smart to push the keynote back to start at, say, 8:45 or even 9:00 AM.  If an attendee is going to drive in from Chicago, Madison or the Fox Valley, a slightly later start time makes things easier, so maybe a simple change like this could boost attendance from those other areas.

Wrap Up

All in all, a great event.  Enough credit cannot go to David Pine, Steven Hicks, Rachel Krause, Amanda Daering and the rest of the Centare team for putting this on.  So take this as a big thank you, for putting on the event, for giving me the opportunity to speak and for giving all of us the opportunity to learn.

Saturday, September 2, 2017

Converting an Existing SQL Server Table to a Temporal Table

One of the very useful features added in SQL Server 2016 was temporal tables.  With a temporal table, SQL Server automatically records a history of all changed data rows to a history table associated with the temporal table.  Further, SQL Server gives us some new syntax to easily query what the data in the table looked like at any point in time, or to show the entire history of a row in the table.  If you want more details, you can check out this earlier blog post I wrote on temporal tables.
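As a quick illustration of that query syntax (using a hypothetical Employees table and an arbitrary example date), a point-in-time query looks like this:

    SELECT EmployeeId, FirstName, LastName
    FROM Employees
        FOR SYSTEM_TIME AS OF '2017-06-01 00:00:00'
    WHERE EmployeeId = 1;

SQL Server resolves this against both the current table and its history table, returning the version of each row that was valid at that moment.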

However, what if we have an existing table in our database that we want to convert to a temporal table?  Let's take a look at how we do that.

For this example, let's assume that we have the following table that already exists in our database.

    CREATE TABLE Employees
    (
        EmployeeId    INT          NOT NULL,
        FirstName     VARCHAR(20)  NOT NULL,
        LastName      VARCHAR(20)  NOT NULL,
        Email         VARCHAR(50)  NOT NULL,
        Phone         VARCHAR(20)  NULL,
        CONSTRAINT PK_Employees
            PRIMARY KEY (EmployeeId)
    );

Converting this table to a temporal table is a two-step process.  The first step is to add ValidFrom/ValidTo columns to the table to represent when each row was active in the table.  We can run the following statement to do this.

    ALTER TABLE Employees ADD
        ValidFrom DATETIME2(3) GENERATED ALWAYS AS ROW START
            NOT NULL DEFAULT '1900-01-01 00:00:00.000',
        ValidTo DATETIME2(3) GENERATED ALWAYS AS ROW END
            NOT NULL DEFAULT '9999-12-31 23:59:59.999',
        PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo);

Here we are adding the two columns needed for when the row is valid and the PERIOD that is required by a temporal table.  Some things to note:

  • We could name the columns ValidFrom and ValidTo anything that we want to, these are just the names that I chose.
  • These columns must be of a DATETIME2 data type.  In this case, I am using DATETIME2(3) to go down to millisecond precision.
  • We need to provide default values for these columns in order to populate the existing rows in the table.  For my ValidFrom I chose 1/1/1900 as a default starting date.  The ending date for the rows in the ValidTo column must be the maximum date/time value for our data type, so in this case, 12/31/9999 at 23:59:59.999.
  • Otherwise, the syntax for the columns looks much like the syntax for columns in a CREATE TABLE statement.
Then, we need to run step 2 of the process:

ALTER TABLE Employees
    SET (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeesHistory));

Here, we turn on SYSTEM_VERSIONING for the table, so the ValidFrom and ValidTo dates will be auto-generated, and we define the name of the history table to use.

And that is all there is to it.  Now, your table has been converted to a temporal table and any changes to your table will be tracked in the history table.
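As a quick sketch of how you might verify the conversion (assuming a row with EmployeeId = 1 already exists; the phone number here is made up), update a row and then query it with FOR SYSTEM_TIME ALL:

    -- Change a row; the prior version is automatically copied to dbo.EmployeesHistory
    UPDATE Employees
    SET Phone = '414-555-0100'
    WHERE EmployeeId = 1;

    -- FOR SYSTEM_TIME ALL returns both the current and historical versions of the row
    SELECT EmployeeId, Phone, ValidFrom, ValidTo
    FROM Employees
        FOR SYSTEM_TIME ALL
    WHERE EmployeeId = 1
    ORDER BY ValidFrom;

You should see two rows: the old version, whose ValidTo is the time of the update, and the current version, whose ValidTo is the maximum date/time value.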