All schools across the U.S. are required to teach English, but why is it so important?
I believe it is important that we study English throughout all our years in school for several reasons. First, English is the most commonly spoken language in the U.S., so it is worth learning well. Since almost everyone in the U.S. speaks English, studying it helps us build a more complex vocabulary. It also allows us to communicate better with each other, as we learn to understand different words and the feelings associated with them. Another reason is that it allows us to learn more about the past: as we study older forms of English, we begin to understand history much better.