Office Space

Discover How REITs Benefit the Working World

What is a REIT?
A REIT, or Real Estate Investment Trust, is a company that owns, operates or finances income-producing real estate. Established by Congress in 1960 and modeled after mutual funds, REITs give Americans the chance to own valuable real estate, offer access to dividend-based income and total returns, and help communities grow, thrive and revitalize.

Learn how REITs benefit your work life
REITs put people to work.
REITs allow anyone to invest in portfolios of large-scale properties the same way they invest in other industries – through the purchase of stocks. REITs own a wide range of real estate, from office buildings and shopping malls to cell towers and data centers. With properties across all 50 states, REITs have a significant impact on the national economy and on local communities. REIT-owned properties not only enhance the quality of life in the surrounding communities, they also create new jobs, better infrastructure, increased economic activity and other improvements.
Learn more at thereitway.com
