Would you do a large, complex data conversion one table at a time, with your systems using one data model, the other, or both concurrently for an extended period? Clearly that is asking for trouble and complexity. Best practice for large data conversions is to bring the entire data model to its desired end state within the shortest possible outage. The same reasoning applies to large-scale code conversions: doing them piecemeal over an extended period invites increased labor costs and technical risk.
IMO, a better approach is to formulate the end-state development and architecture standards for your .NET code and then invest in a process that helps you efficiently rewrite your system so that it conforms to those standards and accurately preserves legacy business rules and functional behavior. Long transitions and complex hybrid/intermediate solutions are a stop-gap at best and a cause of business problems and project failure at worst -- they should be avoided. A better approach lets you deliver the legacy software to the new platform in internally consistent, independent, and well-formed pieces. Furthermore, delivering the migration in fewer, larger pieces is more efficient and less disruptive than delivering many little pieces.
The key to making this approach viable is to use next-generation VB6/COM/ASP-to-.NET tools that let you iteratively calibrate, customize, and verify an automated rewrite process, balancing automated conversion with manual work. The tools from Great Migrations are specifically designed to enable this methodology; we call it the "tool-assisted rewrite". We have used this approach on several large migration projects, including upgrading an application portfolio of 1.2M LOC of VB6/COM to re-engineered C#/.NET.
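To make "re-engineered" a bit more concrete, here is a minimal, hypothetical sketch of the kind of end-state code such a rewrite targets: a VB6-style routine that relied on Variant arguments and On Error Resume Next, re-expressed as strongly typed, explicitly validated C#. The names (InvoiceRules, NetTotal) and the legacy snippet in the comment are illustrative only, not the output of any particular tool.

```csharp
using System;

public static class InvoiceRules
{
    // Legacy VB6 (for reference, hypothetical):
    //   Public Function NetTotal(amt As Variant, pct As Variant) As Variant
    //       On Error Resume Next
    //       NetTotal = amt - (amt * pct / 100)
    //   End Function
    //
    // Re-engineered C#: strong typing, explicit validation, no silently
    // swallowed errors, while preserving the legacy calculation.
    public static decimal NetTotal(decimal amount, decimal discountPercent)
    {
        if (discountPercent < 0m || discountPercent > 100m)
            throw new ArgumentOutOfRangeException(nameof(discountPercent));

        return amount - (amount * discountPercent / 100m);
    }

    public static void Main()
    {
        // Quick check that the translated rule preserves legacy behavior.
        Console.WriteLine(NetTotal(250.00m, 10m));  // -> 225.00
    }
}
```

The point of the tool-assisted approach is that transformations like this are applied consistently across the whole codebase by a calibrated rewrite process, with manual work reserved for the cases the rules cannot handle.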
Disclaimer: I work for Great Migrations.