In my opinion, a big problem with W3C’s tabular data model is that one cannot process or validate a table in an offline process. You have to dereference some URLs to retrieve the necessary metadata. This problem alone already rules out using the CSVW standards for many use cases where connectivity is not guaranteed.
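To illustrate, here is a minimal sketch of a CSVW metadata file (the file name and schema are hypothetical). Note that the required `@context` points at a W3C namespace URL, and the spec’s metadata-discovery steps lean on HTTP mechanisms such as Link headers and well-known URIs, which is where the connectivity assumption creeps in:

```json
{
  "@context": "http://www.w3.org/ns/csvw",
  "url": "countries.csv",
  "tableSchema": {
    "columns": [
      {"name": "code", "datatype": "string"},
      {"name": "population", "datatype": "integer"}
    ]
  }
}
```

A processor that has not bundled the CSVW context, or that must discover this file via the spec’s URL-based lookup, cannot validate `countries.csv` without network access.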
The W3C has a process that seeks evidence of two independent, interoperable implementations of a proposed standard before it can reach recommendation status. Looking at the CSVW Implementation Report, it seems that csvlint and RDF-tabular pass the compliance tests. So, to the W3C, that is sufficient evidence of implementation. It does not matter to them that Tabular Data Packages has seemingly gained more traction in tooling and community usage than CSVW.
On the other hand, in the past the W3C abandoned XHTML 2.0, which was also an esoteric and complex proposal but was built inside the W3C, in favor of HTML 5, which is simpler, more objective, and pragmatic, but was built by the WHATWG, outside the W3C. The reason was that community usage overwhelmingly favored HTML 5, so at some point the W3C decided it could no longer ignore it and brought HTML 5 into its own standardization process.
Perhaps Frictionless Data could take a similar approach to the one the WHATWG took, and eventually succeed in standardizing Tabular Data Packages at the W3C.