According to a proposed theory, the quantity x should have the value x_th. Having measured x in the usual form x_best ± σ (where σ is the appropriate SD), we would say that the discrepancy between x_best and x_th is t standard deviations, where

t = |x_best − x_th| / σ.

How large must t be for us to say the discrepancy is significant at the 5% level? At the 2% level? At the 1% level?
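A short numerical sketch (not part of the problem statement) of how these thresholds can be found, assuming a two-tailed test against the standard normal distribution; the function name `t_threshold` is an illustrative choice:

```python
from statistics import NormalDist

def t_threshold(alpha: float) -> float:
    """Smallest t (in standard deviations) at which a two-tailed
    discrepancy is significant at level alpha, assuming normally
    distributed measurement errors."""
    # P(|Z| > t) = alpha  =>  t = Phi^{-1}(1 - alpha/2),
    # where Phi is the standard normal CDF.
    return NormalDist().inv_cdf(1 - alpha / 2)

for alpha in (0.05, 0.02, 0.01):
    print(f"{alpha:.0%} level: t > {t_threshold(alpha):.2f}")
# 5% level: t > 1.96
# 2% level: t > 2.33
# 1% level: t > 2.58
```

So the discrepancy is significant at the 5% level once t exceeds about 1.96, at the 2% level once t exceeds about 2.33, and at the 1% level once t exceeds about 2.58.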