Does anyone know what may cause the read-back values of array[n] and array[x] (where x == n) to differ from each other?
EDIT: The following is compilable code that illustrates the problem I encountered. If you run it, you won't see anything wrong; I am only using it to describe the problem I saw in my original program, which is a simulator built from 100+ classes.
#include <iostream>
using namespace std;

class MySimulatorImplementationBy100Classes;

class MySimulator {
public:
    enum SigList {
        SIG0,
        SIG1,
        SIG2,
        // ...
        SIG18 = 18,
        SIG19 = 19,
        SIG70 = 70,
        SIG80 = 80
    };
    void run() {}
private:
    MySimulatorImplementationBy100Classes *impl_;
};

int main() {
    MySimulator sim;
    // set up the simulation
    sim.run();

    enum SigAnalysis {
        ANALYSIS0,
        ANALYSIS1,
        ANALYSIS2,
        ANALYSIS3,
        ANALYSIS4,
        ANALYSIS5,
        ANALYSIS6,
        NUM_ANALYSIS
    };

    const MySimulator::SigList signal_source[NUM_ANALYSIS] = {
        MySimulator::SIG18,
        MySimulator::SIG18,
        MySimulator::SIG18,
        MySimulator::SIG18,
        MySimulator::SIG70,
        MySimulator::SIG80,
        MySimulator::SIG19
    };

    for (int i = 0; i < NUM_ANALYSIS; ++i) {
        cout << signal_source[i] << "\t" << MySimulator::SIG80 << "\t"
             << signal_source[5] << "\t" << i << "\t";
        cout << &signal_source[i] << "\t" << &signal_source[5] << "\n";
    }
}
In my program, the output is
18 80 80 0 0xfffe1c90 0xfffe1ca4
18 80 80 1 0xfffe1c94 0xfffe1ca4
18 80 80 2 0xfffe1c98 0xfffe1ca4
18 80 80 3 0xfffe1c9c 0xfffe1ca4
70 80 80 4 0xfffe1ca0 0xfffe1ca4
173068832 80 80 5 0xfffe1ca4 0xfffe1ca4
168047112 80 80 6 0xfffe1ca8 0xfffe1ca4
While I expect it to be
18 80 80 0 0xfffe1c90 0xfffe1ca4
18 80 80 1 0xfffe1c94 0xfffe1ca4
18 80 80 2 0xfffe1c98 0xfffe1ca4
18 80 80 3 0xfffe1c9c 0xfffe1ca4
70 80 80 4 0xfffe1ca0 0xfffe1ca4
80 80 80 5 0xfffe1ca4 0xfffe1ca4
19 80 80 6 0xfffe1ca8 0xfffe1ca4
I don't know why signal_source[i] with i=5 returns something different from signal_source[5], even though &signal_source[i] and &signal_source[5] are identical.
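The only guess I have is that something corrupts the memory of signal_source at run time, and that the compiler treats the two reads differently: since the array is const and its initializers are compile-time constants, signal_source[5] might be folded to the literal 80, while signal_source[i] is an actual load from the (possibly clobbered) stack memory. That is purely an assumption on my part. The toy program below is not code from the simulator; the const_cast write just stands in for whatever stray write might be happening, it is undefined behaviour, and whether the two columns actually differ depends on the compiler and optimisation level:

#include <iostream>
using namespace std;

int main() {
    // Same shape as signal_source: a const stack array whose
    // initializers are compile-time constants.
    const int arr[7] = { 18, 18, 18, 18, 70, 80, 19 };

    // Stand-in for whatever stray write I suspect in the simulator.
    // Writing to a const object is undefined behaviour.
    const_cast<int&>(arr[5]) = 173068832;

    volatile int idx = 5;  // keep the index unknown at compile time

    // An optimising compiler is allowed to fold arr[5] to 80 (its
    // initial value), while arr[idx] is a real load from the now
    // corrupted memory -- so the two can disagree even though
    // &arr[idx] and &arr[5] are the same address.
    cout << arr[idx] << "\t" << arr[5] << "\t"
         << &arr[idx] << "\t" << &arr[5] << "\n";
}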
Furthermore, if I add some dummy code somewhere before the for loop that does the cout:

sim.run();
// dummy print
cout << "dummy\n";
enum SigAnalysis { //...

the results change to:
dummy
172402312 80 80 0 0xfffe1c90 0xfffe1ca4
172446752 80 80 1 0xfffe1c94 0xfffe1ca4
18 80 80 2 0xfffe1c98 0xfffe1ca4
18 80 80 3 0xfffe1c9c 0xfffe1ca4
70 80 80 4 0xfffe1ca0 0xfffe1ca4
80 80 80 5 0xfffe1ca4 0xfffe1ca4
19 80 80 6 0xfffe1ca8 0xfffe1ca4
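The fact that adding an unrelated cout moves the damage to different elements makes me wonder whether something is writing out of bounds on the stack, so that which locals get hit depends on the frame layout. Again, this is only a guess, not something I have confirmed in the simulator. Here is a toy example (undefined behaviour, so it may not misbehave the same way with every compiler or optimisation level) of how an overrun of one local array can land on a neighbouring const array, and why adding or removing unrelated code can move the damage around:

#include <iostream>
using namespace std;

int main() {
    // A const array like signal_source, plus an unrelated scratch
    // buffer in the same stack frame.
    const int signals[7] = { 18, 18, 18, 18, 70, 80, 19 };
    int scratch[4];

    // Deliberate overrun: writes eight ints into a four-int buffer.
    // Undefined behaviour -- with an unlucky stack layout the extra
    // stores land in signals[], and which elements get hit changes
    // when the surrounding code (and hence the layout) changes.
    for (int i = 0; i < 8; ++i)
        scratch[i] = 172000000 + i;

    for (int i = 0; i < 7; ++i)
        cout << signals[i] << "\t" << &signals[i] << "\n";

    cout << scratch[0] << "\n";  // keep scratch from being optimised away
}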
Does anyone have a clue what may be going wrong here?
I am using gcc version 3.4.6. Thanks for the help!