I have a little experience in statistics. If you have a population of 500, you would have to randomly select 83 to get a precision of +/- 10% at a 95% confidence level. If you wanted +/- 5% at a 95% confidence level, you would need to sample 222.
I know it doesn't make intuitive sense, but a sample of 1,111 of our US population gives you a precision of +/- 3% at a 95% confidence level.
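The figures above are consistent with the common rule of thumb for a proportion: take z roughly equal to 2 and worst-case variance (p = 0.5), so n0 = 1/E², then shrink n0 with the finite-population correction. A minimal sketch in Python, assuming exactly that formula (the function name is mine, not from the post):

```python
def sample_size(population, margin):
    # Rule-of-thumb sample size for estimating a proportion:
    # worst-case variance (p = 0.5) with z ~ 2 gives n0 = 1 / margin^2,
    # then apply the finite-population correction n = n0 / (1 + n0 / N).
    n0 = 1 / margin ** 2
    return round(n0 / (1 + n0 / population))

print(sample_size(500, 0.10))          # -> 83
print(sample_size(500, 0.05))          # -> 222
print(sample_size(330_000_000, 0.03))  # -> 1111
```

Note how the correction barely matters for the US population (n0 is tiny relative to N), which is why 1,111 works for 330 million people but a population of 500 needs far fewer than 1/E² would suggest.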
A 95% confidence interval for a population mean (average) involves finding the maximum error of estimate E, which is then added to and subtracted from the sample mean to obtain the interval. E is found by the following calculation:
E = 1.96 * s / sqrt(n)
where n is the sample size and s is the standard deviation of the sample.
Excel can find the mean and s for a highlighted set of data using its built-in statistical functions (AVERAGE and STDEV).
Hypothetical example: mean is 3 and E = 0.9. The interval would be 2.1 to 3.9, and one can be 95% sure that the true population mean falls within it.
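The same calculation can be done outside Excel. A short Python sketch of the E = 1.96 * s / sqrt(n) formula above (the function name and sample data are hypothetical):

```python
import math
import statistics

def confidence_interval(data, z=1.96):
    # 95% confidence interval for the population mean, using the
    # sample mean, the sample standard deviation s, and
    # E = z * s / sqrt(n) as the maximum error of estimate.
    n = len(data)
    mean = statistics.mean(data)
    s = statistics.stdev(data)      # sample standard deviation
    e = z * s / math.sqrt(n)        # maximum error of estimate
    return mean - e, mean + e

# Example with made-up sales figures per outlet:
low, high = confidence_interval([2, 3, 4, 3, 5, 2, 4, 3])
print(f"{low:.2f} to {high:.2f}")
```

With only a handful of observations, a t-multiplier would strictly be more appropriate than 1.96, but the post's formula is the large-sample version shown here.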
It is probably a good idea to throw out any outliers, such as an outlet that sold 12 when nobody else sold more than 7. A sample size of 50, chosen RANDOMLY, would probably lead to a reasonable value.
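A quick screen for an outlier like that 12-unit outlet can be sketched as follows; the two-standard-deviation cutoff is my assumption, not something stated in the post:

```python
import statistics

def drop_outliers(values, k=2):
    # Drop any value more than k sample standard deviations from the
    # mean -- a crude screen, and k = 2 is an assumed cutoff.
    mean = statistics.mean(values)
    s = statistics.stdev(values)
    return [v for v in values if abs(v - mean) <= k * s]

# Made-up per-outlet sales: most sold 7 or fewer, one sold 12.
sales = [3, 4, 5, 2, 6, 3, 4, 5, 7, 12]
print(drop_outliers(sales))  # the 12 is screened out
```

For a one-off analysis, simply eyeballing the data as the post suggests is just as defensible; the point is only that the trimmed sample feeds into the mean and s used above.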