As consumers, we hear a lot about buying organic fruits and veggies these days. But is organic really better? The answer is a resounding yes! Not only does organic produce tend to taste better, but it’s also better for our bodies and the environment. In order for a food to be deemed “organic,” the USDA has set standards for organic agricultural …