Would you feel the presence of God, or know anything about him, if you hadn't been taught about him? If you had been brought up in a society where religious material did not exist and you had no knowledge of God, would he exist for you? Since you have been taught about God, believe in him, and say you feel his presence, isn't that like a placebo effect? That alone could cast doubt on the existence of God.
I'm pretty agnostic myself. I see religion as a form of positive thinking: some people get depressed thinking they are alone in this world of doom, gloom and shrooms, and need to believe that they are being guided in some way.
The fact that so many people believe in the existence of a God complicates the argument, but that number has declined dramatically over the millennia. As our knowledge has grown, other supernatural theories have become obsolete, no longer required to satisfy our human need to explain everything, whether we realize it or not. Because we can scientifically explain so much more nowadays, religious explanations are not needed. The question of "Why are we here, and how did we come to be?" is pretty much beyond anyone's comprehension, so we conjure a higher entity, or "God", as our reason for being. All because, truly, we don't have a fucking clue.