This is an image of a college campus somewhere in the United States.
Do our colleges make the U.S. a great nation? Should college be a goal for everyone? What do you say to critics who claim that colleges teach our young people extreme ideas and lead them to hate America?
Other wordless entries can be found here.