Latest posts
Breaking the silence when things have changed ... Saturday, August 22, 2009

And thus the silence is broken.

The last year has been very stormy for me. There have been sharp turns and big changes. I no longer hold a position as an instructor at the training company Cornerstone but have moved my daily work to Sogeti (a Cap Gemini company), where I now work as an architect consultant.

This is a very positive change and I'm very happy.

With this change I'm also moving my blog to another platform (this one I built myself when I learned NHibernate 4-5 years or so ago). So anyone interested in my new entries, move your feed subscription to  and your browser favorite to http://blog.lowendahl.net

This is where I will put up my new posts. I have yet to decide what to do with my old posts, but they will survive in some form.

Well, head over; I'm going there now to add a post. See you there.

 
Attendee video from PDC Saturday, November 15, 2008

Sondre just put up a really cool video from the PDC: awesome ;)

 
In defense of the data programmability team Saturday, November 01, 2008

During the PDC the following statement came out of the data programmability team:

"We're making significant investments in the Entity Framework such that as of .NET 4.0 the Entity Framework will be our recommended data access solution for LINQ to relational scenarios.  We are listening to customers regarding LINQ to SQL and will continue to evolve the product based on feedback we receive from the community as well."

Apparently a lot of people in the community interpreted this as the obituary of LINQ to SQL, and there have been a lot of harsh words; some comments even suggested that the team had deceived its users.

I would like to add a comment to this debate. First of all, LINQ to SQL wasn't originally a product of the data programmability team; it was put out by another team and gradually incorporated into the data team. At the time, the data programmability team was building its own technology, so they never really asked for this dual-product situation.

Secondly, the statement above does not tell me that they are discontinuing support for LINQ to SQL any time soon. What they are saying is that the effort will go into the Entity Framework, which has been their product from the start; LINQ to SQL hasn't.

Third, after attending the EF futures talk at PDC and having a little chat with Daniel Simmons in the expo, I interpret the statement very differently than the loud mouths doing what they can to discredit the whole team. My interpretation is that it is the team's intention to continue supporting L2S, but since EF is the preferred product, we are more likely to see a merge where the lightweight parts of L2S become available in EF than a complete discontinuation.

So in conclusion: chill. I'm certain the team wouldn't make the mistake of killing a product that so many are using any time soon. And seriously, to those of you who bet large on L2S: from day one of the announcement of EF it has been said that L2S is targeted at RAD applications and that EF will be the preferred enterprise solution. So why are you even surprised?

--

The post: http://blogs.msdn.com/adonet/archive/2008/10/29/update-on-linq-to-sql-and-linq-to-entities-roadmap.aspx

Response from the team to the critics: http://blogs.msdn.com/adonet/archive/2008/10/31/clarifying-the-message-on-l2s-futures.aspx

 
Article published at dotnetslackers.com Saturday, November 01, 2008

I just got an article published over at dotnetslackers.com. Check it out: http://dotnetslackers.com/articles/ado_net/Modeling-domains-with-The-Entity-Data-Model.aspx

 
Live coverage from the PDC (In Swedish) Sunday, October 26, 2008

We are a couple of guys at the PDC microblogging in Swedish with the new Sony Ericsson Xperia (X1) over here: http://blog.pellesoft.se. The content has already started to flow in, so don't miss out.

I'll be doing a couple of longer posts here in English as well.

 
How to introduce new concepts to your team Friday, October 24, 2008

Jimmy just wrote a post on one of the "problems" with DDD. To quote him:

"Even though DDD in itself isn't at all new, it's new to those that haven't tried it before, of course. And if you work in a big team with no prior experience of DDD, then it's very risky using it for a new and time constrained project. There are ways of reducing the risk, but it's still there and it's real."

It's a really interesting point of view and one I, as a coach and instructor, have run into many times, and I think it ties into my earlier post about features being too complex for some developers.

Any new methodology, practice or technology is bound to be difficult in the beginning, and one quickly realizes that just because it works for another team's product, or for the cool speaker on stage, it might not work for me. These failures are more often than not a result of cargo cult thinking, where a team picks up a set of skills from someone else and just tries to copy the success.

That will almost always fail, and that is what brings risk into the project. When you want to introduce something new in a live project you need experience, at least some experience mixed in with the "new guys".

There are several ways to get that experience into the project: you could hire someone like Cornerstone's coaches, Jimmy or other factor10 consultants to hold your hand through your first baby steps, or you can make sure that your team has time to practice new skills (or combine both).

To practice I like to use code katas, which let me practice my skills over and over again on the exact same scenario, refining certain aspects of the piece of technology or methodology I'm pursuing.
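As a small, made-up example of what one round of such a kata can look like, here is a minimal FizzBuzz sketch in C# with an NUnit-style test; the exercise and the names are just illustrations, not something prescribed by any particular dojo:

using NUnit.Framework;

public static class FizzBuzz
{
    // The kata: return "Fizz" for multiples of three, "Buzz" for multiples
    // of five, "FizzBuzz" for multiples of both, otherwise the number itself.
    public static string Say(int number)
    {
        if (number % 15 == 0) return "FizzBuzz";
        if (number % 3 == 0) return "Fizz";
        if (number % 5 == 0) return "Buzz";
        return number.ToString();
    }
}

[TestFixture]
public class FizzBuzzKata
{
    [Test]
    public void Multiples_Of_Three_Say_Fizz()
    {
        Assert.AreEqual("Fizz", FizzBuzz.Say(9));
    }

    [Test]
    public void Multiples_Of_Three_And_Five_Say_FizzBuzz()
    {
        Assert.AreEqual("FizzBuzz", FizzBuzz.Say(15));
    }
}

The point is not the puzzle itself but repeating the same small scenario until the technique you are practicing, test-first in this case, becomes second nature.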

Doing these practices will give you a more thorough understanding of what the "other guys" are doing, and that understanding will eliminate some of the risk, reduce the overhead of introducing new things in real projects and eliminate the cargo cult thinking, since you can now make informed decisions based on your own past experience.

Oh, and the added benefit of doing these practices together as a team is that everyone will share a common viewpoint on how code should be written, and the competence transfer during one of these coding dojos is extremely high.

So, as with anything else in life: take time to practice a lot, practice alone and practice in a group before you try the "real deal", and the risks will be minimized.

 
Code smells are always code smells ... Tuesday, October 21, 2008

... no matter what fancy technology you use.

A question that arises often is when and where to use anonymous methods (or lambdas) where a delegate is expected. Anonymous methods and lambdas are a really nice way to quickly pass in a piece of code where a function pointer is expected.

Quick but sometimes very, very dirty.

To avoid opacity you need to make sure that the anonymous method is clear in its intent and is easy to understand and read. That's why they are great for one-liners (maybe two or three) but start to get messy when you add more. So a rule of thumb is: if the method's intention isn't clear, apply "extract method" and name the method to clearly show intent instead.
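As a small, made-up illustration of that rule of thumb (the Invoice type and the predicate are invented for the example):

using System;
using System.Collections.Generic;
using System.Linq;

class Invoice
{
    public bool IsPaid;
    public DateTime DueDate;
    public decimal Amount;
}

class InvoiceQueries
{
    // Before: the intent of the inline lambda is easy to miss at the call site.
    static IEnumerable<Invoice> OverdueInline(IEnumerable<Invoice> invoices)
    {
        return invoices.Where(i => !i.IsPaid && i.DueDate < DateTime.Today && i.Amount > 0);
    }

    // After "extract method": the predicate now states its intent by name.
    static IEnumerable<Invoice> Overdue(IEnumerable<Invoice> invoices)
    {
        return invoices.Where(IsOverdue);
    }

    static bool IsOverdue(Invoice invoice)
    {
        return !invoice.IsPaid && invoice.DueDate < DateTime.Today && invoice.Amount > 0;
    }
}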


To avoid needless repetition you apply the same refactoring, e.g. if you discover that a piece of code is reused over and over, just refactor it into a method with a name that is clear about its intention.

Also, make it forbidden to copy code: if anyone wants to reuse some code that's already written, they should extract it into a method. The "no copy-paste" rule often makes it easier not to fall into needless repetition, even with anonymous methods.

 
Swedish dinner party @ PDC Thursday, October 16, 2008

Microsoft has put together a nice little Dinner for all the Swedish people coming to PDC.

Read more and register here: http://blogs.msdn.com/johanl/archive/2008/10/15/tr-ffa-andra-svenskar-p-pdc-2008-i-los-angeles.aspx

 
LINQ is Just Another Query Language! Monday, October 13, 2008

Just a quick reminder to all the folks out there. Even though LINQ is really slick, it's still a query language and belongs in the same places as other query languages do. That is, if you query a database, the query should be encapsulated in some kind of DAL component, like a repository; it should not live in the UI and almost never in the service layer (small solutions excluded).

Creating LINQ queries in other places suffers from the exact same "shotgun surgery" (sprawl solution) code smell as putting T-SQL queries there.

So in short: keep your queries, even LINQ queries, inside the DAL.
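To show what I mean, here is a minimal sketch of a repository that keeps the LINQ query inside the DAL; the Customer entity and the IQueryable source are made up for illustration and are not tied to any specific data access technology:

using System.Collections.Generic;
using System.Linq;

public class Customer
{
    public int Id;
    public string City;
}

public interface ICustomerRepository
{
    IList<Customer> FindByCity(string city);
}

public class CustomerRepository : ICustomerRepository
{
    private readonly IQueryable<Customer> _customers;

    public CustomerRepository(IQueryable<Customer> customers)
    {
        _customers = customers;
    }

    public IList<Customer> FindByCity(string city)
    {
        // The LINQ query lives here, behind the repository,
        // not in the UI or the service layer.
        return _customers.Where(c => c.City == city).ToList();
    }
}

The UI then asks the repository for "customers in a city" instead of composing the query itself, so a schema change only hits one place.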

 
SQL Summit 2008: The Compression Session Thursday, October 09, 2008

At TechEd (https://lowendahl.net/showShout.aspx?id=169) last year this feature got a massive round of applause: the ability to compress backups, data files and log files. Kalen gave a great talk on this today where she showcased the benefits you get from the new compression features.

Since large databases are expensive in terms of storage, performance and maintainability costs, Microsoft decided to help DBAs out and ensure that the database can be as small as possible. For the best advantage both for data in memory and on disk, they decided to look at the row storage in the page. The trade-off for this reduction in size is CPU cycles, since compression/decompression adds load to the processors.

There are three compression options:

Row compression

- Compresses the data in the columns and makes every row use less space. The compression algorithm used can't take advantage of values repeating across the whole page. This is an extension of the "vardecimal" storage format introduced in SQL Server 2005 SP2, which tried to store decimals with as few bytes as possible. Row compression gives you a reasonable compression with a small CPU overhead.

Page compression

- Compresses the whole page and can use a compression algorithm that takes advantage of data being repeated across the whole page. Page compression also includes row compression. It gives you the best compression, at a higher CPU cost.

Backup compression

- Compresses the backup file, which leads to less disk I/O and smaller backup files.
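From an application developer's perspective, enabling these options is plain DDL. As a hedged sketch (the database, table, index and paths are made up, and the exact options should be verified against Books Online), it can look roughly like this:

using System.Data.SqlClient;

class EnableCompression
{
    static void Main()
    {
        // Hypothetical connection string and object names.
        using (var connection = new SqlConnection("Data Source=.;Initial Catalog=Sales;Integrated Security=true"))
        {
            connection.Open();

            // Row compression on a single table.
            Run(connection, "ALTER TABLE dbo.OrderRows REBUILD WITH (DATA_COMPRESSION = ROW)");

            // Page compression (which includes row compression) on an index.
            Run(connection, "ALTER INDEX IX_OrderRows_OrderId ON dbo.OrderRows REBUILD WITH (DATA_COMPRESSION = PAGE)");

            // Compressed backup.
            Run(connection, "BACKUP DATABASE Sales TO DISK = 'C:\\backups\\Sales.bak' WITH COMPRESSION");
        }
    }

    static void Run(SqlConnection connection, string sql)
    {
        using (var command = new SqlCommand(sql, connection))
        {
            command.CommandTimeout = 0; // rebuilds and backups can take a while
            command.ExecuteNonQuery();
        }
    }
}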

 

One really interesting option is the ability to have different compression for different data partitions, which makes it possible to compress data based on rules. Maybe you have a year's worth of data where only the last couple of months are frequently accessed. To get the best performance/compression balance you can partition the data and apply row compression to the recent partitions while the older data gets the full page compression. I can see how this will be very important in data warehouse scenarios.

Speaking of data warehouses, Kalen stressed that the design focus for the compression feature was large data warehouse installations, where a vast amount of data has to be handled and the CPU trade-off is therefore acceptable; for scenarios where CPU is more important, just stay away from compression.

Kalen dug deep into how the compression works. Being an application developer with limited knowledge of how pages are laid out, I did get that it was effective, but the exact details eluded me. That's fine, though; I'm not going to switch careers and start hunting bytes in SQL Server data pages.

Overall, this session gave great insight into the problems that compression sets out to solve and how SQL Server actually does the compression.

Thank you Kalen :)

--

Sql Summit 2008 Event site:

http://www.expertzone.se/sql2k8/

More on the compression features:

http://blogs.msdn.com/sqlserverstorageengine/archive/2007/11/12/types-of-data-compression-in-sql-server-2008.aspx

Kalen's blog:

http://sqlblog.com/blogs/kalen_delaney/

 
SQL Summit 2008: I Am the Governor Thursday, October 09, 2008

After Kalen's excellent keynote, our SQL Server MVP Tibor Karaszi entered the stage to dive into the new Resource Governor, one of the features there to strengthen the maintainability aspect of SQL Server.

The Resource Governor is added to the Enterprise edition of SQL Server 2008 and gives a DBA the ability to slice and limit resources for jobs running in the database server. Tibor did a "simple" demo where he created two logins, one for marketing and one for developers, and then divided the resources based on those logins.

That is how the RG does its magic: by classifying an incoming connection it assigns that connection to a "workload group", which in turn is associated with a "resource pool". Once this association is made, the resource pools can be configured with minimum and maximum shares of CPU time and of the installed memory.
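For reference, here is a minimal sketch of that pool-and-group wiring done from client code; the names, limits and connection string are hypothetical, and the classifier function that maps logins to the workload group (the part Tibor demoed) is left out:

using System.Data.SqlClient;

class ResourceGovernorSetup
{
    static void Main()
    {
        // Hypothetical pool and group for the developer login.
        string[] batches =
        {
            "CREATE RESOURCE POOL DeveloperPool WITH (MAX_CPU_PERCENT = 20, MAX_MEMORY_PERCENT = 25)",
            "CREATE WORKLOAD GROUP DeveloperGroup USING DeveloperPool",
            "ALTER RESOURCE GOVERNOR RECONFIGURE"
        };

        using (var connection = new SqlConnection("Data Source=.;Initial Catalog=master;Integrated Security=true"))
        {
            connection.Open();
            foreach (var sql in batches)
            {
                using (var command = new SqlCommand(sql, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }
    }
}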

Sadly, only CPU and memory can be managed this way; there is no support for dividing I/O time, but Tibor offered a glimmer of hope by delivering the Microsoft mantra: "next version".

Apparently there is no way to read the resource usage afterwards, which makes it really hard to do things like debit a client for the resources they use. It is possible to extend the RG with "Extended Events" to build your own home-grown debit functionality, but that could be a lot of work.

One of the things making it hard to debit is that the RG only really kicks in when the resources are maxed out.

A really interesting session, and I can see how this feature will enable more companies to host SQL Server in the cloud for their clients, and how multi-tenant SQL Server installations will be better managed.

--

Sql Summit 2008 Event site:

http://www.expertzone.se/sql2k8/

More information about the Resource Governor

http://www.sql-server-performance.com/articles/per/Resource_Governor_in_SQL_Server_2008_p1.aspx

A Demo-Webcast of the Resource Governor by Tibor (In Swedish):

http://www.cornerstone.se/sv/Roller/IT-tekniker/SQL-Server/Demo-Tv-SQL-Server/Demo-TV-Resource-Governor-SQL-2008/

Tibor's blog:

http://sqlblog.com/blogs/tibor_karaszi/

 
SQL Summit 2008: Key Note Thursday, October 09, 2008

SQL Summit has been touring Sweden this week and today we finally got to the grand finale at "China Teatern" in Stockholm. With 600+ attendees across all four cities, the tour has been a blast, with a lot of interesting discussions.

This morning I had the good fortune to listen to Kalen Delaney giving us a history lesson on SQL Server, from the early Sybase venture that began in 1984 to where SQL Server 2008 is today and beyond.

There was a lot of information to take in, but one of the main points was that every release has aimed to improve one of the M.A.R.S. aspects. M.A.R.S. stands for Maintainability, Availability, Reliability and Scalability.

Kalen argues that every release has worked on all four but focused more on a particular one, and for 2008 it is maintainability that got that extra love. With the Resource Governor and policy-based management, SQL Server becomes much easier to manage.

Here are some photos from the keynote: [photos PICT0251, PICT0263]

--

Sql Summit 2008 Event site: http://www.expertzone.se/sql2k8/

 
"That feature is to complex for my developers..." Tuesday, September 30, 2008

One of the most adverse arguments against exploring new technology, methodology or principles is the false alibi that "complexity makes it harder to staff our project".

This is an overused argument in many cases, and a solid one in some. Both the solid and the overused cases have one thing in common: clients listen to the argument with full attention. Nobody wants an understaffed project. But that makes the argument so much more dangerous and one you should watch out for.

I for one would argue that it is more often used as an alibi by developers who don't want to improve their skills, who don't want to learn new stuff and who don't want the old ways to change.

If a piece of technology, methodology or principle gives you a benefit, use it and educate your team; don't stay away from it. In my book, not evolving the team is the same thing as devolving it.

 
Should we let object oriented programming Rest In Peace? Thursday, September 25, 2008

This is not an obituary for object oriented programming, nor do I think that I have to write one any time soon. This is a post to rally the fanatics, the converted and the blessed to do something about the state of OOP in our community.

At Developer Summit last year, Joakim Sundén talked about the ethics in our line of business, or rather the lack thereof, and how there is a vast number of developers who just don't care about the quality of their code. Arguments like "I don't have time to do OOP" or "OOP looks good on paper but in real life it won't work" are not uncommon, and they are tightly coupled with solutions built on transaction scripts and huge methods (and often accompanied by the same arguments applied to other trade skills like documentation, tests and basic software engineering practices).

My feeling is that "I don't have time to do OOP" really means "I don't have time to learn OOP", and this is even more serious. The moment we as developers stop perfecting our skills is the moment the industry is in danger, the moment the industry starts accepting bad quality, expensive maintenance and code that's disconnected from the business. Should we accept this? Should we let the laziness of some developers set a mark of shame on the whole trade?

I for sure won't!

It's time to rally resistance against these lazy developers, rally against their unwillingness to perfect their skills in the trade, rally against ludicrous arguments that in some circles are treated as, and sold to clients as, truths.

I call upon the developers who think the trade is worth a better fate to start meeting this cancer wherever it shows its ugly face. It's too easy to sigh and say "they just don't get it and I don't have the time nor the energy to make them". Stop. Make your voices heard and take your responsibility for the overall reputation of our trade.

And Remember:

"All that is required for the triumph of evil, is for good people to remain silent and do nothing"

Now go out and evangelize, strike down wherever evil surfaces and make damn sure that ethics, long-term quality and the willingness to perfect one's skills will be the ruling mentality in our line of trade.

Do not stay silent!

 
Behavior driven development, BDD, hype or a paradigm shift? Wednesday, September 24, 2008

I've had the good fortune to listen to Dan North a couple of times, formally and informally, when he talks about his passion, BDD. Even though I've heard him explain the concept many times, read numerous posts and played around with it, I can't say that I grasp the full story yet. But some things I do get.

First of all, there is diversity in the BDD community about what it really means. In one corner there are the initial ideas from Dan, and in the other corner there are variations on the concept, like SpecUnit from Scott Bellware.

But the basic idea is the same: Wording is important.

Dan derives his work from one of his great fascinations, neuro-linguistic programming (NLP), where words are used in communication to trigger responses and effects in the other party. His fascination with NLP, combined with having to explain TDD over and over again, made him think about how important wording is when building test-first kinds of applications.

One of the important words is "test", as in "test-driven development", where people often associate testing with something that is done after development. This is one of the controversies when it comes to selling TDD to the masses: "How can I write my test when I don't have something TO test?"

Changing the word from "test" to "specification" or "behavior" will probably level the playing field.

Everyone specifies the behavior of what they want before they try to build it (well, almost everyone), so SDD, specification-driven development, or BDD as it's called, will make more sense to people and is something they can relate to other activities.

This is where I think the TDD community would do itself a big favor if it started talking more about "specifying what the completed code should do" rather than "writing tests for the completed code before completing the code".

The BDD community talks a lot about wording in other senses as well. Dan, for instance, wants developers to make sure that test names are complete sentences, sentences that match requirements from the business. For instance:

Account_Should_Be_Debited_Given_The_Account_Is_In_Credit_When_Customer_Requires_Cash

is a name inspired by his "Given, When, Then" template for behavior specification.
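To make that concrete, here is a minimal sketch of what such a specification can look like as a plain NUnit-style test in C#; the Account class is made up for the example, and dedicated frameworks like NBehave give richer support for the Given/When/Then vocabulary than this hand-rolled version:

using NUnit.Framework;

// Hypothetical domain class, only here to give the specification something to exercise.
public class Account
{
    public decimal Balance { get; private set; }

    public Account(decimal openingBalance)
    {
        Balance = openingBalance;
    }

    public void Withdraw(decimal amount)
    {
        Balance -= amount;
    }
}

[TestFixture]
public class AccountBehavior
{
    [Test]
    public void Account_Should_Be_Debited_Given_The_Account_Is_In_Credit_When_Customer_Requires_Cash()
    {
        // Given the account is in credit
        var account = new Account(100m);

        // When the customer requires cash
        account.Withdraw(20m);

        // Then the account should be debited
        Assert.AreEqual(80m, account.Balance);
    }
}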

Since names in this naming scheme tend to get long, projects have started to support it, like the mentioned SpecUnit, RBehave, JBehave and NBehave. This is what the community is thinking about now: to support this new style of development we need new tools; the old ones support TDD, not BDD, so what should those tools look like?

So is BDD hype or a paradigm shift? I believe it's the latter. I think that what the BDD and various spec* communities have started is an inevitable change in the "test-first" kind of coding. We see some small ripples now in how people design their tests, but give it time and we'll see big changes in tooling and methodology.

 

--

NLP: http://en.wikipedia.org/wiki/Neuro-linguistic_programming

Merlin's magic spells in modern projects: https://lowendahl.net/showShout.aspx?id=200

Dan's "Introducing BDD": http://dannorth.net/introducing-bdd

Scott on BDD: http://www.code-magazine.com/Article.aspx?quickid=0805061

JBehave: http://jbehave.org/

NBehave: http://nbehave.org/

Rbehave: http://dannorth.net/2007/06/introducing-rbehave
