Hi All,
I am currently building a web application that calculates the Mean Bias Error between selected sets of data.
The scenario is as follows:
1. Based on user input, two columns of MySQL data are selected: one with 288 experimental values and a second with 288 benchmark values.
2. A new set of 288 values is then generated, where value x corresponds to [experimental(x) - benchmark(x)] / benchmark(x) for x from 1 to 288.
3. Finally, the values in the new set are summed, divided by the number of values in the set, and multiplied by 100.
I understand how to program the first and third steps without a problem; however, I haven't managed to write or find a loop that produces the set of values required in step 2.
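To make step 2 concrete, this is roughly what I'm trying to produce, written as a Python-style sketch with made-up placeholder data (in the real application the two lists would be filled from the selected MySQL columns):

    # Illustrative placeholder data -- in the real application these lists
    # would each hold the 288 values pulled from the selected MySQL columns.
    experimental = [10.2, 9.8, 11.5]
    benchmark = [10.0, 10.0, 11.0]

    # Step 2: build the new set of relative differences, one per row
    relative_errors = []
    for exp_val, bench_val in zip(experimental, benchmark):
        relative_errors.append((exp_val - bench_val) / bench_val)

    # Step 3: sum the set, divide by the number of values, multiply by 100
    mbe_percent = sum(relative_errors) / len(relative_errors) * 100
    print(mbe_percent)

It's the equivalent of that middle loop in my own application that I haven't been able to get working.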
Any suggestions as to how I could go about this would be greatly appreciated,
Thanks,
Tiffany Otis
tiffany.otis@yahoo.com