HP 40gs HP 39gs_40gs_Mastering The Graphing Calculator_English_E_F2224-90010.p - Page 132

Alternatively, when data is non-linear in nature you can transform the data mathematically so that it is linear. Let's illustrate this briefly with exponential data. As you can see, I chose a very simple rule for the data: y = 2^x.

If you set up a linear fit for the data in S1 and then view the bivariate stats, you will find that the correlation for a linear fit is 0.9058. As you can easily see from the graph at left, a linear fit is not a very good choice. If we now change to the SYMB SETUP view and choose an Exponential fit rather than a linear fit, the results are far better. The curve which results in the PLOT view is exactly what is required, and the equation comes out as Y = 1·EXP(0.693147X). This "EXP(" is the calculator's notation for Y = 1·e^(0.693147X), which simplifies to Y = 2^X, since 0.693147 ≈ ln 2.

Checking the correlation again shows that it is unchanged at 0.9058, even though the new equation clearly fits the data perfectly; the correlation reported is always that of the linear fit. The value of RelErr, on the other hand, has changed from 0.09256 for the linear fit to a value very close to zero for the exponential model (rounding error may result in something non-zero). The alternative to using RelErr is to graph column C1 against ln(C2), which also straightens the data.

'Linearizing' will cause problems if some of the data points are outside the domain of the function you use, such as negative values in a log function. On the other hand, you have far more control if you are able to choose the exact function. For example, if you had a set of data derived from cooling temperatures, you would probably find that it was asymptotic to room temperature rather than to the x-axis. The built-in equation assumes that the data is asymptotic to the x-axis and would not give a good fit. You could get better results by subtracting a constant from the whole column first.
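The exponential fit the calculator performs can be sketched outside the calculator as well. The following is a minimal illustration, not the calculator's actual algorithm: it linearizes y = 2^x by taking logs, runs ordinary least squares on (x, ln y), and recovers the same coefficients the SYMB SETUP view reports. The data points and variable names are illustrative, not from the book.

```python
import math

# Sample data following the rule y = 2^x (illustrative points).
xs = [1, 2, 3, 4, 5, 6]
ys = [2 ** x for x in xs]

# Linearize: an exponential fit y = a*e^(b*x) becomes a straight
# line ln(y) = ln(a) + b*x in the (x, ln y) plane.
ln_ys = [math.log(y) for y in ys]

n = len(xs)
mean_x = sum(xs) / n
mean_ly = sum(ln_ys) / n

# Ordinary least squares on the transformed data.
b = sum((x - mean_x) * (ly - mean_ly) for x, ly in zip(xs, ln_ys)) \
    / sum((x - mean_x) ** 2 for x in xs)
a = math.exp(mean_ly - b * mean_x)

print(round(b, 6))  # 0.693147, i.e. ln 2
print(round(a, 6))  # 1.0
```

The slope comes out as ln 2 ≈ 0.693147 and the coefficient as 1, matching the calculator's Y = 1·EXP(0.693147X).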


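The cooling-temperature point can be sketched numerically. The snippet below uses made-up data following Newton's law of cooling with an assumed room temperature of 20°: taking logs of the raw temperatures gives a curved plot (because the data is asymptotic to 20, not to the x-axis), while subtracting the room-temperature constant first makes ln(T − 20) exactly linear. All names and values here are illustrative.

```python
import math

# Hypothetical cooling data: T(t) = 20 + 70*e^(-0.1*t), room temp 20 degrees.
ROOM = 20.0
ts = [0, 5, 10, 15, 20, 25]
temps = [ROOM + 70.0 * math.exp(-0.1 * t) for t in ts]

def slope_r2(xs, ys):
    """Least-squares slope and r^2 of ys against xs."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / sxx, (sxy * sxy) / (sxx * syy)

# Fitting ln(T) directly: not straight, since T levels off at 20, not 0.
_, r2_raw = slope_r2(ts, [math.log(T) for T in temps])

# Subtract the room-temperature constant first: now exactly log-linear.
b, r2_shift = slope_r2(ts, [math.log(T - ROOM) for T in temps])

print(round(b, 3))        # -0.1, the cooling rate
print(r2_shift > r2_raw)  # True: the shifted fit is the better one
```

Subtracting the constant recovers the true cooling rate and a near-perfect r², which is exactly why adjusting the column before fitting gives better results than the built-in exponential model alone.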