Oh, that's fun.
I did a bunch of tests, using an a posteriori conlang and an alternate-history Italian. The para-Italian was designed to be very close to actual Italo-Romance, which turned out to produce some very interesting results.
The first sample was a folk song; ChatGPT thought it was Sicilian (not a bad guess). It correctly recognized it as a folk song and then confidently analyzed it as very typical of Sicilian poetry. Bad guess -- I had translated it from a Sephardic love song, in Ladino.
The second sample was a version of Ambarabai ciccì coccò, an Italian counting-out rhyme. ChatGPT identified the language as Neapolitan (again, that works). Its thoughts on it:
The text does not have a clear literal meaning and appears to be a playful or nonsensical phrase commonly found in children's songs or nursery rhymes.
Not bad!
It did not recognize the rhyme, though.
Next I tried the a posteriori conlang. It thought it was Kurdish, very confidently provided a translation, and stated that "the text expresses themes of longing, nature, and traditional Kurdish culture, reflecting the rich oral tradition of Kurdish poetry and music."
In all three cases it provided a translation when I asked for one. For the a posteriori conlang, the translation was entirely wrong, of course. For the para-Italian, it was also wrong, but not so bad.
I think it uses the Google Translate API at some point in the process - Google Translate guesses exactly the same languages.
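One rough way to test that hunch is to run the same samples through Google's own language detection and see whether its guesses line up with ChatGPT's. Here's a minimal sketch using the official Google Cloud Translation client; the sample strings are hypothetical placeholders, and this says nothing about what ChatGPT actually does under the hood:

```python
# Compare Google Translate's language detection against ChatGPT's guesses.
# Assumes the google-cloud-translate package is installed and that
# GOOGLE_APPLICATION_CREDENTIALS points to a valid service-account key.
from google.cloud import translate_v2 as translate

# Placeholders; paste the actual para-Italian / conlang texts here.
samples = {
    "para-Italian folk song": "paste the para-Italian text here",
    "a posteriori conlang": "paste the conlang text here",
}

client = translate.Client()

for label, text in samples.items():
    result = client.detect_language(text)
    # detect_language returns a dict with at least 'language' (an ISO code);
    # 'confidence' may or may not be present depending on the input.
    print(f"{label}: detected {result['language']} "
          f"(confidence {result.get('confidence')})")
```

If Google's detector also lands on Sicilian, Neapolitan, and Kurdish for the same texts, that would be consistent with the hunch, though it could just as well mean both systems lean on similar training data rather than one calling the other.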
I fed it a bit of grammar. Of the para-Italian it said:
The text provides a detailed explanation of the third declension in a Romance language, likely Italian or a closely related dialect. Here's a breakdown of the key points:
It was completely unfazed by Italian having such a thing as a third declension.
It did realize the a posteriori language was fictional, and, interestingly, from that point on it recognized the para-Italian as a conlang as well!
I fed it the first text again (para-Italian folk song) -- it still thought it was Sicilian. Curiously, the translation it provided was different. (Though again, not a bad one!)
Then I tried a different approach: I fed it the very same song, telling it upfront it was in a conlang:
The vocabulary used in the text doesn't correspond to any known language, indicating that it's either a completely invented lexicon or heavily modified from an existing language.
One last try: using the same song, I told ChatGPT it was in a conlang and asked it to translate it.
Since this text is in a fictional language, I can't provide a direct translation since I don't have access to a dictionary or grammar rules for this language. However, I can try to offer a creative interpretation of the text based on its structure and phonetics:
The translation, or interpretation, is completely wrong, or rather creative, just as it warned. But, rather curiously, one verse is translated correctly.