Closing gender and racial gaps may be even more crucial than first thought.
Studies from MIT and Grant Thornton show that diversity can help improve performance at every level of the company, from the bottom line to the boardroom.
According to Grant Thornton’s research, companies with diverse executive boards tend to outperform firms with all-male leadership. As a physicist, I would approach the question by drawing an analogy with artificial intelligence (AI).
Machine learning theory rests on the assumption that the training set is a good approximation of the true sample distribution. If that assumption is violated, there is no guarantee that the results will be accurate. Two telling examples are Beauty.ai and Microsoft’s chatbot ‘Tay’.
Beauty.ai is an international beauty contest judged purely by AI. Controversy erupted when the machine selected mostly white contestants, while entrants with darker skin tones or Asian features were largely passed over by the algorithms.
According to Beauty.ai’s Chief Science Officer, Alex Zhavoronkov, “The main problem was that the data the project used to establish standards of attractiveness did not include enough minorities”.
In other words, if the distribution of out-of-sample data differs significantly from the distribution the model was trained and tested on, the model can fail drastically.
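To make the distribution-shift point concrete, here is a minimal synthetic sketch (all data, group names, and the one-feature “model” are invented for illustration): a simple acceptance threshold learned on one well-represented group ends up rejecting almost everyone from a group that was absent from training.

```python
import random

random.seed(0)

def sample(group_mean, n):
    """Draw n feature values for a group centred at group_mean."""
    return [random.gauss(group_mean, 1.0) for _ in range(n)]

# Two sub-populations with different feature distributions
# (a toy stand-in for an under-represented group).
group_a = sample(0.0, 500)   # well represented in training
group_b = sample(4.0, 500)   # absent from training

# "Model": accept anything within 2 standard deviations of the
# training mean -- learned from group A only.
train = group_a
mean = sum(train) / len(train)
std = (sum((x - mean) ** 2 for x in train) / len(train)) ** 0.5
threshold = mean + 2 * std

def accepted(xs):
    """Fraction of a group falling under the learned threshold."""
    return sum(x <= threshold for x in xs) / len(xs)

print(f"group A accepted: {accepted(group_a):.0%}")  # high
print(f"group B accepted: {accepted(group_b):.0%}")  # near zero
```

The model is not malicious; it has simply never seen group B, so group B looks like an outlier to it, which is the same failure mode reported for Beauty.ai.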
Tay, Microsoft’s chatbot on Twitter, turned into a genocide-supporting Nazi and was shut down 16 hours after it was let loose on the internet. Tay’s artificial brain absorbed the hateful tweets it received as its training set and then used them to generate responses to other web users. Here again, the quality of the data set, i.e. its bias, is to blame.
According to data from ILOSTAT, the median female share of the workforce is about 45%. The underrepresentation of women and the pay gap in several sectors are no secret. In the US, men and women hold roughly equal shares of low-to-mid-level management roles; in executive positions, however, women still lag behind.
The percentage of women in senior roles is gradually growing worldwide, but at the slow rate of a 3% increase from 2011 to 2016, parity is not expected to be achieved until 2060. In the EU’s largest publicly listed companies, only 15% of executives and 5% of CEOs are women.
Even in machine learning, correcting unwanted biases is a challenge. “Just as a traumatic childhood accident can cause lasting behavioral distortion in adults, so can unrepresentative events cause machine-learning algorithms to go off course”, wrote Tobias Baer and Vishnu Kamalnath in their analysis of controlling biases in machine learning.
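One standard, though only partial, remedy is to reweight a skewed training set so that under-represented groups count equally. The sketch below uses invented toy data and inverse-frequency weights (a common technique, not a specific author's method):

```python
from collections import Counter

# Toy skewed training set: each record is (group, label).
# Group "A" is heavily over-represented, mirroring historical
# data in which one group dominates.
train = [("A", 1)] * 80 + [("A", 0)] * 10 + [("B", 1)] * 5 + [("B", 0)] * 5

# Weight each record inversely to its group's frequency, so both
# groups exert equal influence during training.
counts = Counter(group for group, _ in train)
n_groups = len(counts)
weights = [len(train) / (n_groups * counts[group]) for group, _ in train]

# The effective (weighted) share of each group is now balanced.
share = {
    g: sum(w for (grp, _), w in zip(train, weights) if grp == g) / len(train)
    for g in counts
}
print(share)  # both groups ~ 0.5
```

Reweighting balances each group's influence but cannot conjure information that was never collected, which is why representative data, like representative leadership, matters in the first place.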
Given that the percentage of female executives and politicians is not representative of the workforce, what kinds of biases could this introduce into decision-making and strategy?
That being said, do you think that a lack of diversity is like working with biased data, skewing the predictions the “model” makes?
European Commission, “Executives and Non-Executives”, Women and Men in Decision-Making Database (2016).