
Rico Mariani on Why Visual Studio Isn’t 64-bit

by Jonathan Allen on Jan 05, 2016. Estimated reading time: 4 minutes

For a long time now, developers have been asking why Visual Studio hasn’t made the switch to 64-bit. The primary reason isn’t effort or opportunity cost; it’s performance.

This may seem counter-intuitive, but the shift from 32-bit to 64-bit isn’t an automatic win. Access to more CPU registers does help, but mostly for applications doing heavy number crunching on large arrays. For an application such as Visual Studio, which works with large, complex data structures, the overhead of 64-bit pointers dwarfs the benefit of the extra registers. Rico Mariani of Microsoft explains,

Your pointers will get bigger; your alignment boundaries get bigger; your data is less dense; equivalent code is bigger.  You will fit less useful information into one cache line, code and data, and you will therefore take more cache misses.  Everything, but everything, will suffer.  Your processor's cache did not get bigger.  Even other programs on your system that have nothing to do with the code you’re running will suffer.  And you didn’t need the extra memory anyway.  So you got nothing.  Yay for speed-brakes.
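To make the density point concrete, here is a minimal C++ sketch (my illustration, not from the article) of a pointer-heavy node like those found in a symbol table or syntax tree. The same declaration roughly doubles in size when pointers grow from 4 to 8 bytes, so fewer nodes fit in a 64-byte cache line:

<pre>
#include &lt;cstdint&gt;
#include &lt;cstdio&gt;

// A typical pointer-heavy node; its size depends almost entirely on pointer width.
struct Node {
    Node*       left;    // 4 bytes on x86, 8 bytes on x64
    Node*       right;   // 4 bytes on x86, 8 bytes on x64
    const char* name;    // 4 bytes on x86, 8 bytes on x64
    int32_t     kind;    // 4 bytes on both
};

int main() {
    // On a 32-bit build sizeof(Node) is typically 16 bytes: four nodes per
    // 64-byte cache line. On a 64-bit build it is typically 32 bytes
    // (three 8-byte pointers plus padding): only two nodes per line.
    std::printf("sizeof(Node) = %zu bytes\n", sizeof(Node));
    std::printf("nodes per 64-byte cache line = %zu\n", 64 / sizeof(Node));
    return 0;
}
</pre>

Same code, same data, half the density: that is the cache-miss tax he is describing.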

He goes on to say,

Most of Visual Studio does not need and would not benefit from more than 4G of memory.  Any packages that really need that much memory could be built in their own 64-bit process and seamlessly integrated into VS without putting a tax on the rest.   This was possible in VS 2008, maybe sooner.  Dragging all of VS kicking and screaming into the 64-bit world just doesn’t make a lot of sense.

That isn’t to say Visual Studio can’t be improved. But Rico Mariani argues that the solution isn’t to give VS more memory, but rather to make it use less.

Now if you have a package that needs >4G of data *and* you also have a data access model that requires a super chatty interface to that data going on at all times, such that say SendMessage for instance isn’t going to do the job for you, then I think maybe rethinking your storage model could provide huge benefits.

In the VS space there are huge offenders.  My favorite to complain about are the language services, which notoriously load huge amounts of data about my whole solution so as to provide Intellisense about a tiny fraction of it.   That doesn’t seem to have changed since 2010.   I used to admonish people in the VS org to think about solutions with say 10k projects (which exist) or 50k files (which exist) and consider how the system was supposed to work in the face of that.  Loading it all into RAM seems not very appropriate to me.  But if you really, no kidding around, have storage that can’t be economized and must be resident then put it in a 64-bit package that’s out of process.
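As a rough sketch of the storage-model rethink he describes, keeping the bulk of the data on disk and only a working set in memory, the hypothetical cache below (the types and names are mine, not Visual Studio internals) re-reads a file's symbols on demand and evicts the least-recently-used entry instead of holding a 50k-file solution resident:

<pre>
#include &lt;cstddef&gt;
#include &lt;list&gt;
#include &lt;string&gt;
#include &lt;unordered_map&gt;

// Hypothetical per-file symbol data for a language service (not a real VS type).
struct FileSymbols {
    std::string path;
    std::string serializedSymbols;
};

// A small LRU cache over an on-disk index: only the files the user is
// actively editing stay resident; everything else is re-read on demand.
class SymbolCache {
public:
    explicit SymbolCache(std::size_t maxResident) : maxResident_(maxResident) {}

    const FileSymbols&amp; Get(const std::string&amp; path) {
        auto it = index_.find(path);
        if (it != index_.end()) {
            // Hit: move the entry to the front of the LRU list and return it.
            lru_.splice(lru_.begin(), lru_, it->second);
            return lru_.front();
        }
        // Miss: evict the least-recently-used entry, then load from disk.
        if (!lru_.empty() &amp;&amp; lru_.size() >= maxResident_) {
            index_.erase(lru_.back().path);
            lru_.pop_back();
        }
        lru_.push_front(LoadFromDisk(path));
        index_[path] = lru_.begin();
        return lru_.front();
    }

private:
    // Placeholder: a real implementation would deserialize this file's
    // symbols from an on-disk index rather than keep them all in RAM.
    static FileSymbols LoadFromDisk(const std::string&amp; path) {
        return FileSymbols{path, ""};
    }

    std::size_t maxResident_;
    std::list&lt;FileSymbols&gt; lru_;
    std::unordered_map&lt;std::string, std::list&lt;FileSymbols&gt;::iterator&gt; index_;
};
</pre>

The point is not this particular cache, but that most of the data never needs to be in the IDE process at all.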

Turning back to the question of more registers, Rico adds,

But as it turns out the extra registers don't help an interactive application like VS very much, it doesn't have a lot of tight compute intensive loops for instance. And also the performance of loads off the stack is so good when hitting the L1 that they may as well be registers -- except the encode length of the instruction is worse. But then the encode length of the 64 bit instructions with the registers is also worse...

So, ya, YMMV [your mileage may vary], but mostly those registers don't help big applications nearly so much as they help computation engines.

A frequent criticism of this stance points to the shift from 16-bit to 32-bit applications. Developers in the mid-to-late 90s universally hailed that change as beneficial all around, so the question “Why don’t we see the same gains when going to 64-bit?” is often asked. In a follow-up article titled 64-bit Visual Studio -- the "pro 64" argument, he explains the difference.

It was certainly the case that with a big disk and swappable memory sections any program you could write in 32-bit addressing could have been created in 16-bit (especially that crazy x86 segment stuff).  But would you get good code if you did so?  And would you experience extraordinary engineering costs doing so?  Were you basically fighting your hardware most of the time trying to get it to do meaningful stuff?  It was certainly that case that people came up with really cool ways to solve some problems very economically because they had memory pressure and economic motivation to do so.  Those were great inventions.  But at some point it got kind of crazy.  The kind of 16-bit code you had to write to get the job done was just plain ugly.

And here’s where my assumptions break down.  In those cases, it’s *not* the same code.  The 16-bit code was slow ugly [word removed] working around memory limits in horrible ways and the 32-bit code was nice and clean and directly did what it needed to do with a superior algorithm.  Because of this, the observation that the same code runs slower when it’s encoded bigger was irrelevant.  It wasn’t the same code!  And we all know that a superior algorithm that uses more memory can (and often does) outperform an inferior algorithm that’s more economical in terms of memory or code size.

This lesson is applicable to most of the applications we write. If one is writing a computational engine, or having to jump through hoops to manually swap memory, then shifting to 64-bit may be beneficial. But most of the time, staying with 32-bit and reducing the amount of memory being consumed will have a much larger impact, both for the application and for the operating system as a whole.
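One common way to "use less", shown purely as an illustration (this is not a technique prescribed in the article), is to replace pointer links with 32-bit indices into a contiguous pool, which keeps the per-node footprint identical on 32-bit and 64-bit builds:

<pre>
#include &lt;cstdint&gt;
#include &lt;vector&gt;

// Nodes refer to each other by 32-bit indices into one contiguous vector
// instead of chasing 8-byte pointers, so the layout stays dense on x64.
struct CompactNode {
    uint32_t left;    // index into the pool, or kInvalid
    uint32_t right;   // index into the pool, or kInvalid
    uint32_t nameId;  // index into a shared string table
    int32_t  kind;
};                    // 16 bytes regardless of pointer width

constexpr uint32_t kInvalid = UINT32_MAX;

struct Pool {
    std::vector&lt;CompactNode&gt; nodes;

    uint32_t Add(const CompactNode&amp; n) {
        nodes.push_back(n);
        return static_cast&lt;uint32_t&gt;(nodes.size() - 1);
    }
};
</pre>

The trade-off is an extra indirection through the pool, but the data stays dense and cache-friendly on either architecture.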


or... by Mark N

To reverse an old commercial's tag line, "I'd rather switch than fight." Do yourself a favor and look beyond the MS world.

Re: or... by Jonathan Allen

I'm curious as to what problems not having a 64-bit version of Visual Studio is causing you.

Personally I've never had a problem that would have been solved by that, even when debugging 64-bit applications. Most of my issues revolve around the poor support for SQL Server Data Tools.

What does being 64 bit mean precisely? by Henri de Feraudy

For a casual reader it's a little unclear what the issue is here.
What does it mean that Visual Studio is not 64-bit?
* That the compiler itself is not a 64-bit application?
* That the compiler cannot generate a 64-bit application?
* Both?
Could someone please make this explicit?

Re: or... by Daniel Koinzer

Quite easy. Once the memory is used up (around 2GB), the IDE hangs in garbage collections every other second and developing gets near impossible. So this forces me to restart the IDE at regular intervals, especially during heavy refactoring sessions. 64-bit would completely solve it, as I have 32GB of RAM in my system, not 4GB.
