As some people have already mentioned here or here, Copilot purposely stops working on code that contains words from a hardcoded GitHub ban list, such as "gender" or "sex".
I am labelling this as a bug because this behavior is unexpected and undocumented.
I guess you might be embarrassed by what your AI says when it autocompletes gender-related matters, since it is probably trained on biased datasets. However, disabling autocompletion is not a great solution, since gender can be very present in:
- translations in many languages (not so much in English), such as Romance languages where nouns have grammatical gender
- sentence generation (for documentation, games, etc.) in those languages
- medical, administrative, or demographic work
In those cases, Copilot shutting down on most of the files is disappointing and pretty annoying.
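To make the i18n case concrete, here is a minimal, hypothetical sketch (the dictionary and function names are my own invention) of perfectly ordinary translation code where the word "gender" has to appear, and where losing autocompletion hurts:

```python
# Hypothetical i18n helper: French nouns carry grammatical gender,
# so a variable literally named "gender" shows up naturally here.
FRENCH_NOUN_GENDERS = {
    "table": "feminine",
    "livre": "masculine",
    "maison": "feminine",
}

def article_for(noun: str) -> str:
    """Return the French definite article matching the noun's grammatical gender."""
    gender = FRENCH_NOUN_GENDERS.get(noun)
    return "la" if gender == "feminine" else "le"

print(article_for("maison"))  # la
print(article_for("livre"))   # le
```

Nothing in this file is about human gender at all, yet a keyword-based filter would reportedly refuse to autocomplete it.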
I don’t have any elegant solution in mind and I know this seems like an edge case, but I hope this will be taken into account someday.