
enum vs #define

Hi All,
I am confused about the use of enum vs #define. Some told me that enum is better, and some said #define. As per my understanding, when defining sequential data, i.e. 1, 2, 3, 4..., enum is better because one doesn't have to write out the numbers, i.e.
enum Month {
Jan,
Feb, ....
Dec };

But in the case of unrelated, arbitrary values, #define seems better, e.g.
#define TEMPERATURE 35
#define USL 45
#define LSL 25

Which one is better with respect to memory?
What are the advantages/disadvantages of each?

  • To a large degree, it's a matter of taste.

    I prefer enums when I have a number of related integer constants. Enums are clearly preferable when you don't care what the values are. But even when you do need to specify the values for all the constants, I like the mental grouping of an enum. Code documents itself better when you have the type, e.g.

    Error MyFunc();

    clearly returns one of a particular set of error codes, whereas

    int MyFunc()

    might return one of the #define'd list for Unix errno, or maybe something else, or maybe those plus some idiosyncratic values -- who knows? If you have more than one set of return codes, which set does this function use?

    The more specific enum type name helps the tags facility in your editor, greps, debugging, and so on.

    A strict lint may give you some warnings about using enums as integers, for example if you add or OR them together, or pass an enum where an int is expected.

    A const object is different from either an enum or a #define, particularly in C. In ANSI C, a const int takes up space just like a regular int; most compilers will also generate pointer references to this address rather than inlining the value. As a result, I rarely use const ints in C. (C++ has slightly different semantics, so the trade-offs are different there.)
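    A short sketch of the difference in C (constant names are made up for illustration):

    ```c
    #include <assert.h>
    #include <stdio.h>

    #define MAX_DEFINE 100      /* textual substitution: no storage, no address */
    enum { MAX_ENUM = 100 };    /* integer constant expression: no storage      */
    const int MAX_CONST = 100;  /* an object: has an address, occupies space    */

    int main(void) {
        /* Only the const int is an object you can take the address of. */
        printf("MAX_CONST lives at %p\n", (void *)&MAX_CONST);

        /* #defines and enum constants are usable in constant expressions,
         * e.g. array sizes; in C (unlike C++), a const int is not. */
        int a[MAX_DEFINE];
        int b[MAX_ENUM];
        assert(sizeof a == sizeof b);
        return 0;
    }
    ```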

    Every compiler I've ever used has the option to store enums in the smallest space possible. Usually it's even the default option. To force wider enums when using such an option, I usually throw in an extra unsigned value:

    typedef enum {
        MyEnumA,
        MyEnumB,

        MyEnumForce16 = 0x7fff   /* forces at least a 16-bit representation */
    } MyEnum;
    
