NotSerializableException: org.apache.avro.io.DecoderFactory in a Google Cloud Dataflow pipeline

Posted: 2021-08-17 15:35:29

I'm building an example Dataflow pipeline, mainly based on the code at https://cloud.google.com/dataflow/java-sdk/combine

But when I run my code, I experience the following exception:

Exception in thread "main" java.lang.IllegalArgumentException: unable to serialize com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$TestCombineDoFn@139982de
    at com.google.cloud.dataflow.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:51)
    at com.google.cloud.dataflow.sdk.util.SerializableUtils.ensureSerializable(SerializableUtils.java:81)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.ensureSerializable(DirectPipelineRunner.java:784)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.evaluateHelper(ParDo.java:1025)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.evaluateSingleHelper(ParDo.java:963)
    at com.google.cloud.dataflow.sdk.transforms.ParDo.access$000(ParDo.java:441)
    at com.google.cloud.dataflow.sdk.transforms.ParDo$1.evaluate(ParDo.java:951)
    at com.google.cloud.dataflow.sdk.transforms.ParDo$1.evaluate(ParDo.java:946)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.visitTransform(DirectPipelineRunner.java:611)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:200)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
    at com.google.cloud.dataflow.sdk.runners.TransformTreeNode.visit(TransformTreeNode.java:196)
    at com.google.cloud.dataflow.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:109)
    at com.google.cloud.dataflow.sdk.Pipeline.traverseTopologically(Pipeline.java:204)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner$Evaluator.run(DirectPipelineRunner.java:584)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:328)
    at com.google.cloud.dataflow.sdk.runners.DirectPipelineRunner.run(DirectPipelineRunner.java:70)
    at com.google.cloud.dataflow.sdk.Pipeline.run(Pipeline.java:145)
    at com.google.cloud.dataflow.examples.CalcMeanExample.main(CalcMeanExample.java:50)
Caused by: java.io.NotSerializableException: org.apache.avro.io.DecoderFactory
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1184)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.defaultWriteFields(ObjectOutputStream.java:1548)
    at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1509)
    at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1432)
    at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1178)
    at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:348)
    at com.google.cloud.dataflow.sdk.util.SerializableUtils.serializeToByteArray(SerializableUtils.java:47)
    ... 20 more

My code is as follows:

package com.google.cloud.dataflow.examples;

import java.io.Serializable;

import com.google.cloud.dataflow.sdk.Pipeline;
import com.google.cloud.dataflow.sdk.coders.AvroCoder;
import com.google.cloud.dataflow.sdk.coders.DefaultCoder;
import com.google.cloud.dataflow.sdk.coders.StringUtf8Coder;
import com.google.cloud.dataflow.sdk.io.TextIO;
import com.google.cloud.dataflow.sdk.options.DataflowPipelineOptions;
import com.google.cloud.dataflow.sdk.options.Default;
import com.google.cloud.dataflow.sdk.options.DefaultValueFactory;
import com.google.cloud.dataflow.sdk.options.Description;
import com.google.cloud.dataflow.sdk.options.PipelineOptions;
import com.google.cloud.dataflow.sdk.options.PipelineOptionsFactory;
import com.google.cloud.dataflow.sdk.transforms.Combine;
import com.google.cloud.dataflow.sdk.transforms.Combine.CombineFn;
import com.google.cloud.dataflow.sdk.transforms.DoFn;
import com.google.cloud.dataflow.sdk.transforms.ParDo;
import com.google.cloud.dataflow.sdk.util.gcsfs.GcsPath;
import com.google.cloud.dataflow.sdk.values.PCollection;


public class CalcMeanExample
{

public static void main(String[] args) 
{
    Options options = PipelineOptionsFactory.fromArgs(args).withValidation().as(Options.class);
    Pipeline p = Pipeline.create(options);

    PCollection<String> numbers = p.apply(TextIO.Read.named("ReadLines").withCoder(StringUtf8Coder.of()).from(options.getInput()));

    numbers.apply( ParDo.of( new DoFn<String,String>(){
        @Override
        public void processElement(DoFn<String, String>.ProcessContext c) throws Exception {

            System.out.println( c.element() );

        }
    }));

    PCollection<String> average = numbers.apply( Combine.globally( new AverageFn()));


    average.apply(TextIO.Write.named("WriteAverage")
            .to(options.getOutput())
            .withNumShards(options.getNumShards()));

    p.run();

    System.out.println( "done" );
}


public static class AverageFn extends CombineFn<String, AverageFn.Accum, String> {
    @DefaultCoder(AvroCoder.class)
    public static class Accum implements Serializable {
        int sum = 0;
        int count = 0;
    }

    public Accum createAccumulator() { return new Accum(); }

    public void addInput(Accum accum, String input) {
        accum.sum += Integer.parseInt(input);
        accum.count++;
    }

    public Accum mergeAccumulators(Iterable<Accum> accums) {
        Accum merged = createAccumulator();
        for (Accum accum : accums) {
            merged.sum += accum.sum;
            merged.count += accum.count;
        }
        return merged;
    }

    public String extractOutput(Accum accum) {
        return Double.toString(((double) accum.sum) / accum.count);
    }
}



  /**
   * Options supported by {@link CalcMeanExample}.
   * <p>
   * Inherits standard configuration options.
   */
  public static interface Options extends PipelineOptions {
    @Description("Path of the file to read from")
    @Default.String("gs://dataflow-samples/shakespeare/kinglear.txt")
    String getInput();
    void setInput(String value);

    @Description("Path of the file to write to")
    @Default.InstanceFactory(OutputFactory.class)
    String getOutput();
    void setOutput(String value);

    /**
     * Returns ${STAGING_LOCATION}/sorts.txt as the default destination.
     */
    public static class OutputFactory implements DefaultValueFactory<String> {
      @Override
      public String create(PipelineOptions options) {
        DataflowPipelineOptions dataflowOptions = options.as(DataflowPipelineOptions.class);
        if (dataflowOptions.getStagingLocation() != null) {
          return GcsPath.fromUri(dataflowOptions.getStagingLocation())
              .resolve("sorts.txt").toString();
        } else {
          throw new IllegalArgumentException("Must specify --output or --stagingLocation");
        }
      }
    }

    /**
     * By default (numShards == 0), the system will choose the shard count.
     * Most programs will not need this option.
     */
    @Description("Number of output shards (0 if the system should choose automatically)")
    @Default.Integer(1)
    int getNumShards();
    void setNumShards(int value);
  }     

}

Any thoughts on what would be causing this?

1 Answer

#1

We're aware of this issue and are working on a fix which should be available soon.

For now, you should be able to use SerializableCoder rather than AvroCoder for the accumulator.

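For example, a minimal sketch of that change, applied to the accumulator from the question (assuming @DefaultCoder accepts SerializableCoder in this SDK version):

import java.io.Serializable;

import com.google.cloud.dataflow.sdk.coders.DefaultCoder;
import com.google.cloud.dataflow.sdk.coders.SerializableCoder;

// Using Java serialization for the accumulator sidesteps the
// non-serializable org.apache.avro.io.DecoderFactory that AvroCoder holds.
@DefaultCoder(SerializableCoder.class)
public static class Accum implements Serializable {
    int sum = 0;
    int count = 0;
}

The rest of AverageFn stays exactly as in the question; only the coder annotation on the accumulator changes.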

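Alternatively, if you would rather leave the Accum class unannotated, a CombineFn can supply its accumulator coder directly by overriding getAccumulatorCoder. A sketch, assuming the SDK 1.x signature of that method:

import com.google.cloud.dataflow.sdk.coders.Coder;
import com.google.cloud.dataflow.sdk.coders.CoderRegistry;
import com.google.cloud.dataflow.sdk.coders.SerializableCoder;

public static class AverageFn extends CombineFn<String, AverageFn.Accum, String> {
    // createAccumulator, addInput, mergeAccumulators and extractOutput
    // are unchanged from the question; only this override is added.

    @Override
    public Coder<Accum> getAccumulatorCoder(CoderRegistry registry, Coder<String> inputCoder) {
        // Pin the accumulator to Java serialization explicitly.
        return SerializableCoder.of(Accum.class);
    }
}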