C++/Large Multidimensional Arrays


I would like to create a large two-dimensional array of doubles, but every time I create one of roughly this size, array[200][200], the program crashes. I've tried using dynamic memory allocation, but the same thing happens. I need large 2D arrays for writing programs for CFD (computational fluid dynamics) problems, which require large grids.
Thanks for any advice,

There is no problem with doing this in theory, other than that defining such a large local automatic array makes the stack usage large.

On modern 32-bit or greater systems, however, this should not be a problem. Assuming doubles are 8 bytes (64 bits) in size, 8*200*200 gives 320,000 bytes used. This assumes you are using a compiler that generates code for 32-bit (or larger) systems.

One reason the program may crash in the dynamic allocation case is that the allocation failed, but the program did not check for failure, assumed it had the memory anyway, and started accessing memory through a null pointer or a pointer to some random memory location. Again, although you require quite a large chunk of memory, I would expect it not to be a problem on a modern 32-bit system with plenty of RAM and virtual memory space, unless you have other memory-hogging applications running or your program is using many, many similarly sized chunks of memory. One way this might occur is if you keep allocating the memory but never delete or free it. If you allocate using new [], then release the memory when no longer required with delete []. If you allocate with malloc, then release using free.

This size of array will be a problem on systems that have limited stack space (or limited memory, for that matter), or for compilers that target such systems. For example, if you are using a 16-bit compiler for MS-DOS or Windows 3.1, it is likely that you will be restricted to 64KB or less for all stack space used by your program.

A similar problem exists for dynamic allocation of memory in 16-bit Intel x86 based compilations. In general you can dynamically allocate more than 64KB in total, providing no single block exceeds this amount (in fact the limit will probably be slightly less than the full 65536 bytes). This assumes you compile using a memory model that supports multiple 64KB data segments, such as the large memory model, and that the system has enough real or virtual memory available. You may find your 16-bit operating system or compiler runtime library provides additional support for allocating larger blocks, such as the 16-bit Microsoft C library function halloc (for huge alloc).

If you are using such a compiler on a 32-bit operating system then you should upgrade to a 32-bit compiler. There are several free choices available. The Microsoft Visual C++ 2005 Express Edition, Bloodshed Dev-C++, and MinGW are popular choices for 32-bit Windows platforms. The GNU C++ compiler is always popular on Linux and UNIX systems, and runs on various operating systems using various memory widths (16, 32, 64-bit).

I cannot be more specific as you do not give details of the compiler or system you are using, nor do you mention what reason was given for the program crashing: stack overflow, out of memory, access violation, etc.


Oh, I should mention that you can also eat up stack space by allocating the 200 by 200 array of doubles on the stack frames of multiple function calls. Each active function call will add at least 320,000 bytes to the stack usage. This can occur, for example:

- Because such an array is used by multiple functions which call each other.

- By recursively calling a function that allocates such an array on the stack.

- By passing the array from function to function by value. This is difficult to do accidentally, as C/C++ pass built-in arrays by reference (i.e. as a pointer), but it may occur if you used some class with copy semantics to represent the array, such as a std::vector<double> (although this of course is only 1D).


Answers by Expert:


Ralph McArdell


I am a software developer with more than 15 years of C++ experience and over 25 years of experience developing a wide variety of applications for Windows NT/2000/XP, UNIX, Linux and other platforms. I can help with basic to advanced C++, C (although I do not write just C much if at all these days, so maybe ask in the C section about purely C matters), software development and many platform-specific and system development problems.


My career started in the mid 1980s working as a batch process operator for the now defunct Inner London Education Authority, working on Prime minicomputers. I then moved into the role of Programmer / Analyst, also on the Primes, then into technical support and finally into the microcomputing section, using a variety of 16- and 8-bit machines. Following the demise of the ILEA I worked for a small company, now gone, called Hodos. I worked on a part-task train simulator using C and the Intel DVI (Digital Video Interactive) - the hardware-based predecessor to Indeo. Other projects included a CGI-based train simulator (different goals to the first), and various other projects in C and Visual Basic (er, version 1 that is). When Hodos went into receivership I went freelance and finally managed to start working in C++. I initially had contracts working on train simulators (surprise) and multimedia - I worked on many of the Dorling Kindersley CD-ROM titles and wrote the screensaver games for the Wallace and Gromit Cracking Animator CD. My more recent contracts have been more traditionally IT based, working predominantly in C++ on MS Windows NT, 2000, XP, Linux and UN*X. These projects have had wide-ranging additional skill sets including system analysis and design, databases and SQL in various guises, C#, client server and remoting, cross-porting applications between platforms and various client development processes. I have an interest in the development of the C++ core language and libraries and try to keep up with at least some of the papers on the ISO C++ Standard Committee site at


©2016 All rights reserved.