How can American Realism best be defined?
Explanation
American Realism seeks to portray life accurately and truthfully, without romanticizing or idealizing it. In contrast to Romanticism, it represents everyday experiences and social realities as they are.