Foreign relations of the United States are the country's bilateral relations with other countries.
Foreign relations of the United States may also refer to: