
Cannot resolve field [product], input field list:[user, EXPR$0]

Published: 2023/12/31

This article from 生活随笔 explains the Flink Table API error "Cannot resolve field [product], input field list:[user, EXPR$0]" and how to fix it; it is shared here for reference.

The failing Flink Table API usage looks like this:

Table counts = Orders
        .groupBy($("user"))
        .select($("user"), $("product"), $("amount").sum().as("cnt"));

Exception in thread "main" org.apache.flink.table.api.ValidationException: Cannot resolve field [product], input field list:[user, EXPR$0]
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.failForField(ReferenceResolverRule.java:80)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$3(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseThrow(Optional.java:290)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$null$4(ReferenceResolverRule.java:75)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.lambda$visit$5(ReferenceResolverRule.java:74)
	at java.util.Optional.orElseGet(Optional.java:267)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:71)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule$ExpressionResolverVisitor.visit(ReferenceResolverRule.java:51)
	at org.apache.flink.table.expressions.ApiExpressionVisitor.visit(ApiExpressionVisitor.java:31)
	at org.apache.flink.table.expressions.UnresolvedReferenceExpression.accept(UnresolvedReferenceExpression.java:60)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.lambda$apply$0(ReferenceResolverRule.java:47)
	at java.util.stream.ReferencePipeline$3$1.accept(ReferencePipeline.java:193)
	at java.util.ArrayList$ArrayListSpliterator.forEachRemaining(ArrayList.java:1374)
	at java.util.stream.AbstractPipeline.copyInto(AbstractPipeline.java:481)
	at java.util.stream.AbstractPipeline.wrapAndCopyInto(AbstractPipeline.java:471)
	at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(ReduceOps.java:708)
	at java.util.stream.AbstractPipeline.evaluate(AbstractPipeline.java:234)
	at java.util.stream.ReferencePipeline.collect(ReferencePipeline.java:499)
	at org.apache.flink.table.expressions.resolver.rules.ReferenceResolverRule.apply(ReferenceResolverRule.java:48)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.lambda$null$1(ExpressionResolver.java:211)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.flink.table.expressions.resolver.ExpressionResolver.resolve(ExpressionResolver.java:178)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.projectInternal(OperationTreeBuilder.java:191)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:160)
	at org.apache.flink.table.operations.utils.OperationTreeBuilder.project(OperationTreeBuilder.java:151)
	at org.apache.flink.table.api.internal.TableImpl$GroupedTableImpl.select(TableImpl.java:630)
	at Overview_Examples.main(Overview_Examples.java:42)

Cause:

The Flink Table API enforces only_full_group_by semantics.

Logic: after a groupBy, the only plain fields still in scope are those listed in the groupBy clause; every other field must go through an aggregate function. That is why the error reports the input field list as [user, EXPR$0]: user is the grouping key and EXPR$0 is the unnamed sum aggregate, so the plain field product cannot be resolved.
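The rule can be seen outside Flink as well: once rows are grouped by user alone, one output row per user may correspond to several distinct product values, so there is no single product to select. A minimal plain-Java sketch with hypothetical order data (not Flink code) makes the ambiguity concrete:

```java
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.stream.Collectors;

public class GroupByAmbiguity {
    // A hypothetical order record: user, product, amount.
    record Order(String user, String product, int amount) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("alice", "book", 10),
                new Order("alice", "pen", 3),
                new Order("bob", "book", 7));

        // Group by user alone: each group can contain several distinct products.
        Map<String, Set<String>> productsPerUser = orders.stream()
                .collect(Collectors.groupingBy(Order::user,
                        Collectors.mapping(Order::product, Collectors.toSet())));

        // For "alice" the group holds two products -- there is no single
        // product value per row, which is exactly why the planner rejects
        // select($("product")) when product is not in the groupBy.
        System.out.println(productsPerUser);
    }
}
```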


Solution:

Table counts = Orders
        .groupBy($("user"), $("product"))
        .select($("user"), $("product"), $("amount").sum().as("cnt"));

Why this works:

Every plain field in the select must also appear in the groupBy. Fields wrapped in an aggregate function, such as amount inside sum(), are exempt from this rule.
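The semantics of the corrected query can be mimicked with plain Java streams: group by the composite key (user, product) and aggregate amount within each group. A sketch using hypothetical data (the record type and values are illustrative, not from the original post):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class GroupBySum {
    // Hypothetical order record mirroring the Orders table's columns.
    record Order(String user, String product, int amount) {}

    public static void main(String[] args) {
        List<Order> orders = List.of(
                new Order("alice", "book", 10),
                new Order("alice", "book", 5),
                new Order("alice", "pen", 3));

        // Group by the composite key (user, product); amount only appears
        // inside an aggregate (summingInt), so every selected value is
        // well defined per group -- the only_full_group_by contract.
        Map<List<String>, Integer> cnt = orders.stream()
                .collect(Collectors.groupingBy(
                        o -> List.of(o.user(), o.product()),
                        Collectors.summingInt(Order::amount)));

        System.out.println(cnt.get(List.of("alice", "book"))); // 15
    }
}
```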


Why does the same query run fine in MySQL even though it breaks this rule?

Because your MySQL is older than 5.7. Before 5.7, MySQL did not enable the ONLY_FULL_GROUP_BY SQL mode by default, so non-aggregated columns outside the GROUP BY were allowed (MySQL picked an arbitrary value per group). The Flink Table API follows the stricter semantics that MySQL 5.7+ enables by default.

Summary

The above covers the cause of, and fix for, the Flink Table API error "Cannot resolve field [product], input field list:[user, EXPR$0]". Hopefully it helps you resolve the problem you ran into.