If you simply run our query first and then the LINQ to SQL one, the second will always come out faster (see the sketch below for a fairer way to time both), because by then:
1) the connection is already available in the connection pool
2) the database has read the data from disk into memory
3) the query plan is cached and the results are streamed to the client with little overhead
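To take those effects out of the equation, warm both sides up first and only then time them. A minimal sketch of such a harness; the FetchBenchmark name, the iteration count and the fetch delegates are illustrative, not from any product API:

```csharp
using System;
using System.Diagnostics;

static class FetchBenchmark
{
    // Times a fetch action after a warm-up pass, so connection pooling,
    // the database's buffer cache and plan caching hit both candidates
    // equally instead of only the one that happens to run second.
    public static TimeSpan Measure(Action fetch, int iterations = 25)
    {
        fetch(); // warm-up: fills the connection pool and the caches

        var sw = Stopwatch.StartNew();
        for (int i = 0; i < iterations; i++)
        {
            fetch();
        }
        sw.Stop();
        return sw.Elapsed;
    }
}
```

Run Measure once for each mapper's fetch, in either order; after the warm-up pass the ordering no longer decides the outcome.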
That said, in O/R mapper land there's a simple rule: the simplest O/R mapper is the fastest when it comes to plain, straightforward entity fetching. The reason is equally simple: there's less overhead, because a simple O/R mapper just copies values from the datareader into some object and that's about it.
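For illustration, such a bare-bones materializer is little more than this; the Customer class, the table and the column ordinals are made up for the example:

```csharp
using System.Collections.Generic;
using System.Data;
using System.Data.SqlClient;

public class Customer
{
    public int CustomerId { get; set; }
    public string CompanyName { get; set; }
}

public static class NaiveMaterializer
{
    // The whole 'mapper': read a row, copy the column values into an
    // object. No events, no authorization, no dependency injection checks.
    public static List<Customer> FetchCustomers(SqlConnection connection)
    {
        var result = new List<Customer>();
        using (var command = new SqlCommand(
                   "SELECT CustomerId, CompanyName FROM Customers", connection))
        using (IDataReader reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                result.Add(new Customer
                {
                    CustomerId = reader.GetInt32(0),
                    CompanyName = reader.GetString(1)
                });
            }
        }
        return result;
    }
}
```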
LLBLGen Pro's fetch pipeline is optimized to the point where we couldn't find anything left to optimize. LINQ to SQL has some advantages here: it pre-generates IL to fetch the data (so that part is a bit faster) and has less overhead in the entities because the O/R mapper is simple. (We, for example, make authorization calls and check whether dependency injection has to take place. It's all very quick if nothing has to be done, but it still takes a bit of time, and if you fetch a lot of rows it adds up.)
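The IL that LINQ to SQL emits isn't something I can show here, but the technique can be approximated with a compiled expression tree: build the 'copy from reader' delegate once per entity type, then pay only a plain delegate call per row. A rough sketch under that assumption, not LINQ to SQL's actual code:

```csharp
using System;
using System.Data;
using System.Linq.Expressions;
using System.Reflection;

public static class MaterializerFactory
{
    // Builds a compiled delegate equivalent to:
    //   reader => new T { Prop0 = (T0)reader[0], Prop1 = (T1)reader[1], ... }
    // The delegate is created once per entity type; the per-row cost is then
    // a single delegate call, similar in spirit to pre-generated IL.
    public static Func<IDataRecord, T> Create<T>(params string[] propertyNames)
        where T : new()
    {
        ParameterExpression reader =
            Expression.Parameter(typeof(IDataRecord), "reader");
        var bindings = new MemberBinding[propertyNames.Length];

        for (int i = 0; i < propertyNames.Length; i++)
        {
            PropertyInfo property = typeof(T).GetProperty(propertyNames[i]);
            // reader[i] comes back as object; unbox/cast to the property type.
            Expression value = Expression.Convert(
                Expression.MakeIndex(
                    reader,
                    typeof(IDataRecord).GetProperty("Item", new[] { typeof(int) }),
                    new Expression[] { Expression.Constant(i) }),
                property.PropertyType);
            bindings[i] = Expression.Bind(property, value);
        }

        var body = Expression.MemberInit(Expression.New(typeof(T)), bindings);
        return Expression.Lambda<Func<IDataRecord, T>>(body, reader).Compile();
    }
}
```

Used as `var materialize = MaterializerFactory.Create<Customer>("CustomerId", "CompanyName");` inside the reader loop (assuming property order matches column order), it removes per-row reflection entirely.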
The downside of LINQ to SQL is that its LINQ provider, albeit solid, is very slow: a complex LINQ query can take a long time (relatively speaking) to parse into SQL. With LLBLGen Pro that's not the case. Also, as Arschr has pointed out, more complex scenarios will show you a totally different picture:
http://weblogs.asp.net/fbouma/archive/2008/03/07/developing-linq-to-llblgen-pro-part-14.aspx
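As an aside: LINQ to SQL does let you pay that parse cost once instead of per call, via CompiledQuery.Compile. A minimal sketch, where NorthwindDataContext and Order stand in for your designer-generated classes:

```csharp
using System;
using System.Data.Linq;
using System.Data.Linq.Mapping;
using System.Linq;

// Placeholder entity and context standing in for the designer-generated ones.
[Table(Name = "Orders")]
public class Order
{
    [Column(IsPrimaryKey = true)] public int OrderID { get; set; }
    [Column] public string CustomerID { get; set; }
}

public class NorthwindDataContext : DataContext
{
    public NorthwindDataContext(string connection) : base(connection) { }
    public Table<Order> Orders
    {
        get { return GetTable<Order>(); }
    }
}

public static class OrderQueries
{
    // CompiledQuery.Compile translates the expression tree to SQL once,
    // when the delegate is built; every later call reuses that translation
    // and skips the (relatively slow) parsing step.
    public static readonly Func<NorthwindDataContext, string, IQueryable<Order>>
        ByCustomer = CompiledQuery.Compile(
            (NorthwindDataContext context, string customerId) =>
                context.Orders.Where(o => o.CustomerID == customerId));
}
```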
So 'it depends'. In the end, nothing beats a raw datareader: every other approach is slower. If you want optimal performance and keep making concessions towards it, you thus end up at a raw datareader. Or you accept that features take time, and the question becomes: "OK, features take time; is the code optimal and profiled accordingly, so the best performance is reached considering these features?" I can only say: yes. If there were another bottleneck somewhere, we'd fix it, but I don't know of any, simply because removing one now would mean cutting out a feature, and that puts us on the slippery slope back towards a raw datareader. (Which we offer you to use, by the way: if you don't agree with our entity fetch speed, you can fetch a datareader as a projection and plug in a custom entity projector to create a faster entity projection system if you want.)
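For reference, and in contrast to the typed entity copy shown earlier, the baseline everything gets measured against skips even object mapping: stream rows straight from a raw datareader into untyped arrays. A generic illustration, not LLBLGen Pro's actual projection API:

```csharp
using System.Collections.Generic;
using System.Data.SqlClient;

public static class RawReaderBaseline
{
    // The performance ceiling: no entities, no events, no features at all.
    // Assumes the connection is already open.
    public static List<object[]> Fetch(SqlConnection connection, string sql)
    {
        var rows = new List<object[]>();
        using (var command = new SqlCommand(sql, connection))
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
            {
                var values = new object[reader.FieldCount];
                reader.GetValues(values); // bulk-copy the current row
                rows.Add(values);
            }
        }
        return rows;
    }
}
```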