The computational solution of large-scale linear programming problems presents various difficulties. One of them is ensuring numerical stability. Another difficulty is of a different nature: the original data themselves contain errors. In this paper we show that the effect of random errors in the original data on the optimal value diminishes as the number of constraints and the number of variables increases. Laws of large numbers in probability theory provide the mathematical formulation of this diminishing effect. This paper was inspired by Prekopa [3], who proved both weak and strong laws of large numbers for random linear programs in the independence setting. We obtain laws of large numbers for random linear programs under negatively associated dependence, thereby extending Prekopa's results [3] to the case of negatively associated random variables.
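As a side illustration (not part of the paper's proofs), a classical source of negatively associated random variables is simple random sampling without replacement; the sketch below, under that assumption, shows the law-of-large-numbers behavior in this dependence setting: the sample mean of such draws concentrates around the population mean as the sample size grows.

```python
import random

def na_sample_mean(population, n, seed=0):
    """Mean of n values drawn without replacement.

    Indicators of inclusion in a simple random sample without replacement
    are a classical example of negatively associated random variables, so
    this mean is an average of negatively associated terms.
    """
    rng = random.Random(seed)
    return sum(rng.sample(population, n)) / n

# Hypothetical population for illustration: the integers 0..999.
population = list(range(1000))
pop_mean = sum(population) / len(population)  # 499.5

# The deviation from the population mean shrinks as n grows,
# even though the summands are dependent (negatively associated).
for n in (10, 100, 900):
    dev = abs(na_sample_mean(population, n) - pop_mean)
    print(f"n = {n:4d}, |sample mean - population mean| = {dev:.2f}")
```

This is only a toy analogue: the paper's results concern the optimal value of a random linear program whose coefficients are negatively associated, not a plain sample mean, but the concentration mechanism is the same.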