Answer by
JoeJohnson (41)
Whole Foods is a great organic food store in the United States. It revolutionized the organic food market, taking it from a small mom-and-pop mentality into the corporate world. Some other great stores with strong organic selections include Trader Joe's and Wegmans. There is also a broader trend toward adding organic food sections, so most grocery stores will now carry organic options.