Technology often works silently as a double-edged sword. On the one hand, it can amplify inequality, reinforce existing stereotypes, and push people further into categories that do not represent them. On the other, its design can help devise strategies and counternarratives that redirect ongoing discourses towards a fairer and more inclusive society. Giving voice to and operationalizing these reflections is difficult, however, especially in new, complex fields such as robotics and artificial intelligence (AI). To this end, this book collects the reflections, insights, and tools resulting from the Diversity, Equity, and Inclusion for Embodied AI (DEI4EAI) project. It is intended for students, researchers, designers, developers, and societal stakeholders who work with embodied AI and want to contribute to more equitable and just futures.

Everything dubbed ordinary is in fact deeply cultural: it embodies values, beliefs, and narratives that influence how we collect and use data, how we craft algorithms, how we define agency, how we mold AI embodiment, how we design interaction, and how we define embodied AI interventions. Although in different roles and capacities, designers, researchers, and broader stakeholders such as policymakers and communities are responsible for reflecting on how their values, perspectives, biases, and stereotypes may affect embodied AI technology. This matters because siloed practices limit our capacity to assess the risks and harms of our actions. To avoid designing harmful or inadequate technology, we need to inspect narratives, practices, and methods with reflexivity and an openness to shifting mindsets.