Investigations of systemic biases in AI models for the clinical domain have been limited. We re-created a series of models predicting the need for wraparound services and inspected them for bias across age, gender, and race using the AI Fairness 360 framework. The re-created models reported performance metrics comparable to the original efforts. Inspection with the AI Fairness 360 framework found a low likelihood that patient age, gender, or race introduces bias into our algorithms.
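As an illustration of the kind of check AI Fairness 360 supports, the sketch below computes a disparate impact ratio (the metric exposed by AIF360's `BinaryLabelDatasetMetric.disparate_impact()`) by hand. The helper function, group labels, and data are hypothetical examples, not the study's actual cohort or code.

```python
# Hand-rolled disparate-impact ratio, illustrating the metric AIF360
# computes. All data below are made up for demonstration only.

def disparate_impact(outcomes, groups, privileged):
    """Ratio of favorable-outcome rates: unprivileged / privileged.

    A value near 1.0 suggests parity across groups; the common
    "80% rule" flags values below 0.8 as potentially biased.
    """
    priv = [y for y, g in zip(outcomes, groups) if g == privileged]
    unpriv = [y for y, g in zip(outcomes, groups) if g != privileged]
    rate_priv = sum(priv) / len(priv)
    rate_unpriv = sum(unpriv) / len(unpriv)
    return rate_unpriv / rate_priv

# 1 = flagged as needing wraparound services (favorable outcome);
# "A" is the privileged group, "B" the unprivileged group.
outcomes = [1, 0, 1, 1, 0, 1, 0, 1]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]
print(disparate_impact(outcomes, groups, privileged="A"))
```

In AIF360 itself, the same quantity is obtained by wrapping the data in a `BinaryLabelDataset` and querying `BinaryLabelDatasetMetric`; the manual version here just makes the arithmetic explicit.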