Can I use the term 'America' to signify just the United States?
Solution 1:
It depends on whom you are writing for. In Europe, Asia, and Oceania, generally speaking, yes: "America" denotes the USA unless otherwise qualified. However, in North and South America that is not the case, and in fact it can come across as rude to use it that way. In those circumstances a more specific term would be appropriate.
In Britain and Australia, the Wall Street Journal is an American newspaper, but in Canada it is a US newspaper.
EDIT: I should add that it is always correct to say "US". The Wall Street Journal is also a US newspaper in Britain and Australia, so "US" is a safe default.
Solution 2:
The safest approach is to refer to the US as "the US" rather than just "America".
In my experience, most Canadians and Mexicans do not appreciate being referred to as "Americans", even though they are North Americans. Meanwhile, some people in South and Central America call themselves "American", on the grounds that the US is not the only country in the Americas and its citizens should not be the only ones described that way.