Five years ago it would have been considered bad practice. Today, however, more and more developers are becoming convinced that if you're writing a program that will never reach the 100-user mark, you don't have to build it as if it will be used by hundreds of thousands of users and need 10 servers to keep it going.
So your intranet application will be used by 20 users (let's make it 100, since your company will grow a little), but that's not shocking. Even if every user keeps 2MB of data in their session object, that's still only 200MB of memory at the 100-user mark, and 2MB per session is already highly unlikely. Also, if you program cleanly, you can define a 'per page' bucket in the session: on a non-postback page load you clear the bucket, and on a postback you leave it alone and re-use its contents. This way every page change clears the temp data of the previously visited page, so you won't be carrying a lot of data around in the session object.
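A minimal sketch of what such a bucket could look like in ASP.NET WebForms; the key name "PageBucket", the Dictionary contents and the base-page approach are just my own choices for illustration, not something prescribed above:

```
using System;
using System.Collections.Generic;
using System.Web.UI;

// Hypothetical base page: every page that derives from this shares one
// session slot for its temporary data.
public class BucketPage : Page
{
    private const string BucketKey = "PageBucket";

    // The bucket holding temp data for the page currently being viewed.
    protected Dictionary<string, object> Bucket
    {
        get { return (Dictionary<string, object>)Session[BucketKey]; }
    }

    protected override void OnLoad(EventArgs e)
    {
        if (!IsPostBack)
        {
            // Fresh load of this page: throw away whatever the previously
            // visited page stored in the bucket.
            Session[BucketKey] = new Dictionary<string, object>();
        }
        // On a postback the bucket is left untouched so its contents can be re-used.
        base.OnLoad(e);
    }
}
```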
User B overwriting data that user A has altered (the lost-update problem) is common. I wrote an article about this once:
http://weblogs.asp.net/fbouma/archive/2003/05/24/7499.aspx
In other words: the only approach that really works is merging the data; any other method throws away someone's work, unless you prevent editing the data in the first place when it's already being edited by someone else. Locking like that exposes a bottleneck in the organisational structure of the work, which means the work can be done more efficiently if it's rescheduled so people don't need to edit the same data at the same time.
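For completeness, here is a minimal sketch of at least detecting the conflict before deciding whether to merge; the Orders table, the RowVersion column and the plain ADO.NET calls are assumptions made purely for illustration:

```
using System;
using System.Data.SqlClient;

public static class OrderSaver
{
    // Returns true if the save succeeded, false if someone else changed the
    // row since we read it (so a merge is needed instead of an overwrite).
    public static bool TrySave(string connectionString, int orderId,
                               string newNotes, byte[] originalRowVersion)
    {
        const string sql =
            "UPDATE Orders SET Notes = @notes " +
            "WHERE OrderId = @id AND RowVersion = @originalVersion";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            command.Parameters.AddWithValue("@notes", newNotes);
            command.Parameters.AddWithValue("@id", orderId);
            command.Parameters.AddWithValue("@originalVersion", originalRowVersion);

            connection.Open();
            // 0 rows affected means the row was changed (or deleted) by someone
            // else in the meantime: don't silently overwrite their work.
            return command.ExecuteNonQuery() == 1;
        }
    }
}
```

When TrySave returns false you know the data changed underneath you, and you can re-read it and present both versions to the user so the edits can be merged instead of one of them being thrown away.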
So I'd go ahead with what you've planned.