Higher-kinded types reduce complexity in the same way generics reduce code complexity. If you think one makes things simpler but not the other, I'd guess you're just not familiar with them.
Do you have examples of why it would add significant complexity? I use them constantly and it really simplifies my code a lot!
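To make that concrete, here's a minimal Haskell sketch (names are mine, purely illustrative) of what abstracting over a type constructor buys you: one function that works for any container with a `Foldable` instance, rather than one copy per concrete type. This is exactly the kind of abstraction that ordinary .NET-style generics can't express, because it parameterizes over `f :: * -> *` rather than a plain type.

```haskell
import Data.Foldable (foldl')

-- 'sumIn' is written once against the higher-kinded parameter f,
-- so it works for lists, Maybe, Either e, trees, etc.
sumIn :: (Foldable f, Num a) => f a -> a
sumIn = foldl' (+) 0

main :: IO ()
main = do
  print (sumIn [1, 2, 3])                        -- list:   6
  print (sumIn (Just 4))                         -- Maybe:  4
  print (sumIn (Right 5 :: Either String Int))   -- Either: 5
```

Without HKTs you'd write (or codegen) a separate `sumList`, `sumOption`, `sumResult`, and so on.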
Also, I don't believe 'profunctor optics' is a paper; I think it's just one (interesting) implementation of optics, which are useful in any language that encourages immutability.
The complexity that HKTs bring to F# is not just conceptual: they need to exist in the CLR, which brings implementation tradeoffs.
"Cool paper, but no thanks" is a saying for when people come to you with an extremely new technique and want you to incorporate it at the language level with very little proving in the field.
"Profunctor optics" is a description of how profunctors encode lenses and similar structures. I am aware. I also think that technique can cook in Haskell land for a few more years before non-research projects should adopt it.
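For anyone following along, here is a stripped-down sketch of the encoding being discussed (simplified; real optics libraries differ in detail and class hierarchy). A lens from `s` to `a` is a function polymorphic over any "strong" profunctor; instantiating the profunctor at the plain function arrow recovers `over`:

```haskell
{-# LANGUAGE RankNTypes #-}

-- A profunctor is contravariant in its first argument, covariant in its second.
class Profunctor p where
  dimap :: (s -> a) -> (b -> t) -> p a b -> p s t

-- "Strong" profunctors can pass extra context through untouched.
class Profunctor p => Strong p where
  first' :: p a b -> p (a, c) (b, c)

-- A lens is any transformation that works for every Strong profunctor.
type Lens s t a b = forall p. Strong p => p a b -> p s t

-- Build a lens from a getter and a setter.
lens :: (s -> a) -> (s -> b -> t) -> Lens s t a b
lens get set = dimap (\s -> (get s, s)) (\(b, s) -> set s b) . first'

-- The function arrow is itself a Strong profunctor, which yields 'over'.
instance Profunctor (->) where
  dimap f g h = g . h . f

instance Strong (->) where
  first' f (a, c) = (f a, c)

over :: Lens s t a b -> (a -> b) -> s -> t
over l = l

-- Example: a lens focusing the first component of a pair.
_1 :: Lens (a, c) (b, c) a b
_1 = lens fst (\(_, c) b -> (b, c))
```

So `over _1 (+1) (1, "x")` evaluates to `(2, "x")`. Getting a view back out requires a second profunctor (the usual `Forget` construction), which is where the encoding starts to get heavier than the van Laarhoven style most Haskell libraries actually ship.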
Thanks for replying! Yeah, I suppose I hadn't thought about the ramifications of having to deal with reified generics and higher-kinded types. I still think they're an amazing productivity boost, and the initial 'weirdness' of learning that particular abstraction pays off tremendously in code readability.
As for the 'cool paper' comment, I wasn't aware of that jargon, thanks for informing me, heh. I'm not sure how well it will play outside of Haskell; I've done a Scala encoding of them and it's a bit more awkward than other representations.
I'm sorry, but implicit slippery-slope arguments are not a means by which one can force every functional language to run the project the way Haskell does.
But it's also worth noting that this is not a rejection of the entire concept of lenses, just that right now Haskell's implementations leave a lot to be desired (e.g., abandon all error sanity, ye who enter here).
Haskell is as much a research platform as a programming language, and as such it has the leisure to experiment this way. Everyone else will wait a while for things to bake. Look what happened when Haskell built its entire stdlib on monad transformers, and then we all realized they're awful compared to the alternatives.
I mostly agree with this. But nobody is "forcing" anything; one can still use Haskell and avoid, e.g., the infamous (and non-idiomatic) lens library.
But just as you haven't given up on lenses, I also don't agree with the wholesale rejection of ideas based on imperfect implementations, which is what prompted my comment.
They're really good at saying "No" to things that offer at best incremental improvements at the cost of significant complexity.
"Profunctor Optics" is a great example. Cool paper, but no thanks.