My apologies for taking so long to reply on this subject, but I plead guilty to being old, slow thinking, and generally thorough.
Lately I have succeeded in creating a number of basic standard probability scales as usually defined: greater than zero, up to and including 1. When I took Stat 101, someone forgot to tell me that these are merely midpoint interval scales at every point except the 1.
I have always taken a result like 527 heads out of 1,000 coin tosses as an actual one-tail probability of 0.043852515 in the infinite scheme of the universe. It turns out this figure is merely the midpoint scale amount. The actual comparable one-tail probability of 527 heads occurring is 0.005874295, a difference of almost 7 1/2 times. May I be allowed to call that a significant difference?
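For anyone who wants to check my arithmetic, here is a short Python sketch of my own. As far as I can tell, the first figure matches the one-tail value at the point 527 under the usual normal approximation (no continuity correction), and the second is close to the probability of exactly 527 heads in 1,000 fair tosses.

import math
from statistics import NormalDist

n, p, k = 1000, 0.5, 527
mu = n * p                            # 500 expected heads
sigma = math.sqrt(n * p * (1 - p))    # about 15.81

# One-tail value at the point 527 on the usual scale
# (normal approximation, no continuity correction): about 0.0439
tail_at_527 = 1 - NormalDist(mu, sigma).cdf(k)

# Probability of exactly 527 heads in 1,000 fair tosses
# (exact binomial): about 0.0059
exactly_527 = math.comb(n, k) * p**k * (1 - p)**(n - k)

print(f"one-tail at 527 (normal approx): {tail_at_527:.9f}")
print(f"P(exactly 527 heads):            {exactly_527:.9f}")
print(f"ratio:                           {tail_at_527 / exactly_527:.1f}")  # roughly 7.5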
Now this is mostly statistics talk so far. I think we ordinary humans would better understand these results in the form of odds against. In this example, the generally accepted scale amount translates to almost 22:1 odds against its being mere chance. Since I am an ordinary human, I take that as pretty chancy, but maybe possible. I think I would ask for the coin flips to be repeated to see what happens in the next 1,000.
But the actual odds are 169:1. You have little chance of convincing me that these 527 heads are a fair result.
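For completeness, odds against are simply (1 - p) / p, rounded. Extending the sketch above with the two probabilities I quoted gives the 22:1 and 169:1 figures.

def odds_against(prob):
    # Convert a probability into "chances against : 1" odds.
    return (1 - prob) / prob

print(f"{odds_against(0.043852515):.0f}:1")   # about 22:1 for the usual scale amount
print(f"{odds_against(0.005874295):.0f}:1")   # about 169:1 for the actual amount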