Culturally, when Americans hear the words "civil war," they think of the great battles fought in Virginia: Bull Run, the Seven Days, Antietam, Fredericksburg, Chancellorsville, Gettysburg, and the Overland Campaign. Personally, however, I believe the fate of the South was sealed in the other theater of the war.
I believe the war fought along the Mississippi River and through Tennessee and Georgia was more decisive. Campaigns such as Vicksburg and the March to the Sea mattered more than the dramatic battles fought in the East. In many ways, the war was won in the West. The Confederates were winning in Virginia until the Overland Campaign in 1864, yet they had been losing everywhere else since 1862.
To put it in perspective, it wouldn't really matter what happened in the East if the North won in the West. The only way the East could have changed anything is if Lee had well and fully destroyed the Army of the Potomac: not forcing it to retreat, but annihilating it. Reversed, it doesn't really matter if you're losing in Virginia as long as you're holding off the North in the West. Virginia was only really important because of Richmond, which was essentially a political target. I find it very hard to believe that the entire Southern army would just go belly up and surrender if it lost Richmond.
These are my views; what do you think?