Florida "US football" redirects here. For the national governing body, see USA Football. Football in the United States may refer to: American football, sport American football in the United States, sport in the country Australian rules football in the United States Gaelic football in the United States Rugby league in the United States Rugby union in the United States Soccer in the United States, association football Topics referred to by the same term