Even if people yell at me: "What are you talking about, why do you post such crap...",
this is just a brainstorming about the differences between declarations, definitions etc. in "any" programming language (C, C++ or any other).
The main question to ask: which statements in your source code actually generate (binary, executable) code?
- Declaration
A declaration does not generate any code. It is just there to "declare" a type, an interface or a reference (e.g. as external):
extern void myFunction(void);
does not generate any code. It just tells the compiler: "hey, there is an external function with this signature". And the compiler can check whether you call myFunction properly (here: with no arguments and without expecting a return value).
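A minimal sketch of how such a declaration lets the compiler check calls (the function caller() and the wrong call are made up for illustration):

extern void myFunction(void);   // declaration only: no code is generated here

void caller(void) {
    myFunction();       // OK: matches the declared signature
    // myFunction(42);  // would be rejected: the declaration says "no arguments"
}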
- Definition
You can also define new types, new classes, new structures etc. - but this will also not yet create code (or reserve space in memory).
Example:
class MyClass {
public:
    void MyClassFunction();
};

void MyClass::MyClassFunction() {
    //... do something - have code here
}
Even though you already provide code for MyClassFunction(), nothing is "alive" yet: no space in memory is reserved for an object of MyClass (and an unused function may even be dropped later by the linker).
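A small sketch to illustrate the point: the class definition alone reserves no storage, sizeof only tells you how big an instance would be (the main() function and the data member are made up for demonstration):

#include <cstdio>

class MyClass {
public:
    void MyClassFunction();
    int value;   // part of the object layout, but no storage reserved yet
};

int main(void) {
    // No MyClass object exists so far - only the type is known.
    printf("an instance would need %zu bytes\n", sizeof(MyClass));
    return 0;
}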
- Instantiation
Only if you instantiate a class do you create an object, e.g. via
MyClass myObject;
Now you actually generate code and reserve memory for the object.
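Putting declaration, definition and instantiation together, a minimal sketch (the printf() output is made up, purely for illustration):

#include <cstdio>

class MyClass {
public:
    void MyClassFunction();
};

void MyClass::MyClassFunction() {
    printf("now there is code AND an object\n");
}

int main(void) {
    MyClass myObject;             // instantiation: memory for the object
    myObject.MyClassFunction();   // using the defined member function
    return 0;
}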
A compiler can have several reasons to "complain":
- a declaration mismatches with an existing declaration, e.g. the parameter list of a function is not identical (not the same "signature")
- you define something which is already defined: the type, class, variable etc. is already there. It can also happen that the compiler accepts your definition but the linker finds a similar one in a library (LIB) used: it has no idea which one to use (see the first sketch after this list)
- it cannot be instantiated: in object-oriented languages like C++ there is the concept of "pure virtual". It forces you, as in: "OK, you can reuse me, but you have to provide the implementation for some stuff, e.g. member functions". So all might look OK, but you still cannot create an object (an instance) of such a class: it has "undefined" pieces, the pure virtual methods (see the second sketch below).
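First sketch: a redefinition the compiler (or the linker) rejects. The names counter and helper() are made up for illustration; the quoted error texts are typical for gcc/clang but depend on your toolchain:

int counter = 0;    // definition
int counter = 0;    // compile error: redefinition of 'counter'

// file_a.cpp
void helper(void) { /* ... */ }

// file_b.cpp
void helper(void) { /* ... */ }   // each file compiles fine on its own...
// ...but linking both together typically fails with something like:
// "multiple definition of 'helper()'"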
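Second sketch: a class with a pure virtual member function cannot be instantiated until a derived class provides the missing implementation (the class names Printer and TextPrinter are made up):

class Printer {
public:
    virtual void print() = 0;   // pure virtual: "you have to provide this"
    virtual ~Printer() {}
};

class TextPrinter : public Printer {
public:
    void print() override { /* ... the actual implementation ... */ }
};

int main(void) {
    // Printer p;     // error: cannot create an object of an abstract class
    TextPrinter t;    // OK: all pure virtual functions are implemented
    t.print();
    return 0;
}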
So, in the end:
You have to make your compiler happy: it mainly starts with following the "prototypes" (the declarations). You can only use what was declared, and only in the way it was declared.
You cannot use anything that was not defined. Obvious: a variable which does not exist (was not defined) cannot be used.
And objects exist only after instantiation: a defined class does not yet create an instance. Objects come from classes (or structures) and need an instantiation, even if the definition is OK.
Just to think about what "happens behind the scenes" (for a compiler). Like in human communication:
- let's declare some naming, acronyms, types and syntax before we keep talking
- let's define what we want to have, using our declarations (nothing can be defined if its attributes, e.g. a type, were not declared)
- let's generate code: only instances of defined objects generate code. We have to create an instance of something defined (see the sketch after this list).
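The three stages in one place, as a minimal sketch (the names sayHello and Greeter are made up for illustration):

void sayHello(void);              // 1. declare: "this function exists somewhere"

struct Greeter {                  // 2. define: a new type, still no memory used
    int id;
};

int main(void) {
    Greeter g = { 42 };           // 3. instantiate: now memory is reserved
    sayHello();                   //    ...and declared, defined things can be used
    return 0;
}

void sayHello(void) { /* ... */ } // the definition matching the declaration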
This might help you to understand where a compiler "comes from": a declaration is missing (or conflicts with an existing one), a definition is missing (before creating and using an instance) or the instantiation fails (due to a missing definition). Like three stages of understanding a complete sentence in the same correct way: "have the same syntax, have the same semantics, have the same resulting action".
There is no need to understand how compilers work (behind the scenes). But there is a need to be unambiguous, clear, precise and "complete" in order to let the compiler do what you want to get. If an error is fired: it was not clear to the compiler what you mean or want to do.
Just start to think like:
- was a type wrong? (a mismatch with existing prototypes, declarations or compiler expectations)
- was there a use of something undefined, unknown, not resolvable via #include?
- is there a statement which really generates code, e.g. using a function which cannot be found, or using an object which was never created? (see the sketch below)
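To make this checklist concrete, a hedged sketch of the typical error categories (the exact error wording depends on your compiler and linker; gcc/clang style messages are assumed here, and both function names are made up):

void declaredOnly(void);        // declared, but never defined anywhere

int main(void) {
    // undeclaredFunction();    // compile error: not declared at all, e.g.
                                // "'undeclaredFunction' was not declared in this scope"
    declaredOnly();             // compiles, but linking typically fails, e.g.
                                // "undefined reference to 'declaredOnly()'"
    return 0;
}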