about bitset


First of all, thank you for answering my previous questions and helping me.

I would like to know:
1) What is the difference between #include <bitset> and #include <bitset.h>? If I use #include <bitset.h> it gives an error - why? Please tell me why you have used #include <iostream> and not #include <iostream.h>.

2) What is the difference between an array and a bitset?
3) When sending bitset data to another computer, should I store the bitset data in another variable?

I am using VC++ 6.

1/ <bitset> is the name of the header file you have to include to use std::bitset. bitset.h has nothing to do with standard C++, so it probably does not exist, or it refers to a pre-standard implementation - possibly from the original Standard Template Library from Hewlett-Packard; I forget the details.

The situation is similar for <iostream>: <iostream> is the standard C++ header. In this case, however, there existed a pre-standard version as part of the pre-standard IOStreams library (sometimes called traditional IOStreams), and this was called <iostream.h>.

All C++ standard library headers have no extension. Those that contain declarations and definitions from the C library have the prefix c (for example, the C library header <stdio.h> becomes <cstdio> in standard C++).

For backward compatibility some implementations, especially older ones such as VC++ 6, provide both the pre-standard and the standard header file names. Those that are also C compilers of course provide both the C headers and the C++ wrapper headers.

Another difference between pre-standard and standard implementations of C++ library features is that, wherever possible, standard C++ library features live in the namespace std. As namespaces were a fairly late addition to the language, old versions of C++ library features do not in general place library names in any namespace.

2/ Eh? An array is a built-in language feature, unless you mean some class you have available to you. A bitset is a class template that is part of the C++ standard library and is designed to make it fairly easy to work with sets of bits of any (fixed) size. It is probably implemented as an array of larger storage units - on most modern architectures at least an array of bytes (8-bit unsigned char, most likely), as this is the smallest addressable unit on such processors. std::bitset is a convenience, as are all the library features: you could spend time writing the operations it provides yourself, but if it does what you need efficiently enough, why waste time re-inventing the wheel?

A bitset has to behave slightly differently from, say, a std::vector<bool>, in that a bool is often represented as a whole byte (char) as a compromise between space and efficiency. Bit operations within machine words can be slower than reading or setting the value at an address. For example, the question "is this byte 1?" can be answered more quickly than the question "is this bit of this byte 1?"

3/ Yes. There is no requirement that the implementation of std::bitset on your machine, with your operating system and your compiler and standard C++ library, is the same as or compatible with mine. However, you can input and output a bitset from/to a stream. This, as I showed before, works with the bits shown as a string of '1' and '0' characters. You could therefore locate a socket stream implementation, or you could just use the construct-from-string constructor and the to_string member function.

It is probably better, at least at first, to send the data as a string of '0' and '1' characters rather than as a multi-byte value (10 bits requires at least 2 bytes to send). The reason is that different processors arrange the bytes that make up longer words in different orders. This is known as the endianness of the processor. Popular processors are generally either big endian, in which the most significant byte of a word is stored at the lowest memory address, or little endian, in which the least significant byte of a word is stored at the lowest memory address. Intel x86 processors are little endian. Sun SPARC, Motorola 68K and the PowerPC family of processors are big endian.

Alternatively you can look at using the htons and ntohs (host-to-network short and network-to-host short) macros to convert values from host to network byte ordering and back again at the other end. Note that these are not standard C++: they are part of the socket implementation, which under Windows is called Windows Sockets, or Winsock (the current version is 2). The documentation says you need to include Winsock2.h, link with Ws2_32.lib and use Ws2_32.dll; however, I would be surprised if the libraries are in fact needed just to use these macros. On the other hand, you will most likely be using sockets somewhere down the line, at which point the libraries would be needed.

In this case you would use the std::bitset::to_ulong member function to get the bits as an unsigned long, cast it to an unsigned short (or short), use htons to convert it to network byte order, and send it to the receiving machine. On the receiving machine you would use ntohs to convert to that machine's host byte ordering, and construct a std::bitset from the resulting value.

This is all I am going to say on this subject, so please do not ask further follow-ups on the same theme, as we are drifting into areas which would take me much too long to explain and which you should research yourself: read books, look at other people's code, etc. In this respect, learning to use a search engine effectively is a good move, as is learning to use the MS Developer Network and its library (see as a starting point).




Ralph McArdell


I am a software developer with more than 15 years C++ experience and over 25 years experience developing a wide variety of applications for Windows NT/2000/XP, UNIX, Linux and other platforms. I can help with basic to advanced C++, C (although I do not write just-C much if at all these days so maybe ask in the C section about purely C matters), software development and many platform specific and system development problems.


My career started in the mid 1980s working as a batch process operator for the now defunct Inner London Education Authority, working on Prime mini computers. I then moved into the role of Programmer / Analyst, also on the Primes, then into technical support and finally into the micro computing section, using a variety of 16 and 8 bit machines. Following the demise of the ILEA I worked for a small company, now gone, called Hodos. I worked on a part task train simulator using C and the Intel DVI (Digital Video Interactive) - the hardware-based predecessor to Indeo. Other projects included a CGI based train simulator (different goals to the first), and various other projects in C and Visual Basic (er, version 1 that is). When Hodos went into receivership I went freelance and finally managed to start working in C++. I initially had contracts working on train simulators (surprise) and multimedia - I worked on many of the Dorling Kindersley CD-ROM titles and wrote the screensaver games for the Wallace and Gromit Cracking Animator CD. My more recent contracts have been more traditionally IT based, working predominately in C++ on MS Windows NT, 2000, XP, Linux and UN*X. These projects have had wide ranging additional skill sets including system analysis and design, databases and SQL in various guises, C#, client server and remoting, cross porting applications between platforms and various client development processes. I have an interest in the development of the C++ core language and libraries and try to keep up with at least some of the papers on the ISO C++ Standard Committee site at


©2017 All rights reserved.