I have a simple question. I am no mathematician and I need to understand this. I am using a spreadsheet to figure out different variables in changing a person's salary. I have two numbers, one salary before a law change and one salary after a law change. I want to show the percentage by which the law changed the value of the salary.
So before the law change, the salary was worth $20,418.75. After the law change, the salary was worth $96,375.19. So to find how much the law changed the salary, I used the formula ((96,375.19 - 20,418.75)/20,418.75)*100. I get 371.9936, which I take to mean a 372% increase. I also did the reverse: ((20,418.75 - 96,375.19)/96,375.19)*100, and got -78.8132702, which I take to mean that if it had happened in the reverse order, the salary would've gone down about 79%. I don't understand. Why isn't it the same both ways? Shouldn't it either go up or down by the same number, i.e. shouldn't the salary go up 372% or down 372% depending on the chronological order? How could it be 372% one way and 79% the other? Also, can someone confirm I am correct when I say: by increasing from $20,418.75 to $96,375.19, the salary increased by 372%?
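To sanity-check the arithmetic, here is a small Python sketch (variable names are my own) that computes the percent change in both directions. It also shows why the two magnitudes differ: percent change always divides by the *starting* value, and the two directions have different starting values.

```python
before = 20418.75
after = 96375.19

# Percent change = (new - old) / old * 100; the denominator is the starting value.
increase = (after - before) / before * 100   # baseline is 20,418.75
decrease = (before - after) / after * 100    # baseline is 96,375.19

print(round(increase, 4))  # 371.9936  (about a 372% increase)
print(round(decrease, 4))  # -78.8133  (about a 79% decrease)
```

The same absolute change ($75,956.44) is almost four times the smaller salary but only about four-fifths of the larger one, which is why the percentages are not mirror images of each other.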
I am building a spreadsheet full of these percent changes over time, to show the percentage by which the numbers went up or down after law changes, and it must be accurate. Any help on ensuring accurate percent changes over time would be great.