Troubleshooting a Sqoop error: Error running child : java.lang.OutOfMemoryError: Java heap space

Date: 2022-05-25 12:25:15

Error stack trace:

-- ::, INFO [main] org.apache.hadoop.mapred.MapTask: Processing split: = AND =
-- ::, INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Working on split: = AND =
-- ::, INFO [main] org.apache.sqoop.mapreduce.db.DBRecordReader: Executing query: select "EXTEND3","EXTEND2","EXTEND1","MEMO","OPER_DATE","OPER_CODE","FILE_CONTENT","FILE_NAME","INPATIENT_NO","ID" from HIS_SDZL."MDT_FILE" tbl where ( = ) AND ( = )
-- ::, INFO [Thread-] org.apache.sqoop.mapreduce.AutoProgressMapper: Auto-progress thread is finished. keepGoing=false
-- ::, FATAL [main] org.apache.hadoop.mapred.YarnChild: Error running child : java.lang.OutOfMemoryError: Java heap space
at java.util.Arrays.copyOf(Arrays.java:)
at java.lang.AbstractStringBuilder.expandCapacity(AbstractStringBuilder.java:)
at java.lang.AbstractStringBuilder.ensureCapacityInternal(AbstractStringBuilder.java:)
at java.lang.AbstractStringBuilder.append(AbstractStringBuilder.java:)
at java.lang.StringBuffer.append(StringBuffer.java:)
at java.util.regex.Matcher.appendReplacement(Matcher.java:)
at java.util.regex.Matcher.replaceAll(Matcher.java:)
at java.lang.String.replaceAll(String.java:)
at QueryResult.readFields(QueryResult.java:)
at org.apache.sqoop.mapreduce.db.DBRecordReader.nextKeyValue(DBRecordReader.java:)
at org.apache.hadoop.mapred.MapTask$NewTrackingRecordReader.nextKeyValue(MapTask.java:)
at org.apache.hadoop.mapreduce.task.MapContextImpl.nextKeyValue(MapContextImpl.java:)
at org.apache.hadoop.mapreduce.lib.map.WrappedMapper$Context.nextKeyValue(WrappedMapper.java:)
at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:)
at org.apache.sqoop.mapreduce.AutoProgressMapper.run(AutoProgressMapper.java:)
at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:)
at org.apache.hadoop.mapred.MapTask.run(MapTask.java:)
at org.apache.hadoop.mapred.YarnChild$.run(YarnChild.java:)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:)
at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:)

Reducing the fetchsize parameter did not help either, so the problem was most likely a single row taking up a very large amount of memory. Line 244 of QueryResult.java, the record class Sqoop generates for the imported table, points to the FILE_CONTENT column, which is a binary column. Querying the source database confirmed it: sure enough, the largest value in that column is as big as 180 MB.

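Since the root cause is a single oversized BLOB row rather than the fetch size, the practical remedy is to give each map task a larger heap. A hedged sketch of the Sqoop invocation (the connection URL, credentials, directories, and memory sizes are placeholders, not values from this job; mapreduce.map.memory.mb and mapreduce.map.java.opts are standard Hadoop settings):

```shell
# Sketch only: connection URL, credentials, and sizes are placeholders.
# -D generic options must come before the Sqoop-specific arguments.
# The -Xmx value must stay below the container size (memory.mb).
sqoop import \
  -Dmapreduce.map.memory.mb=4096 \
  -Dmapreduce.map.java.opts=-Xmx3600m \
  --connect jdbc:oracle:thin:@//dbhost:1521/orcl \
  --username his_user -P \
  --query 'SELECT "EXTEND3","EXTEND2","EXTEND1","MEMO","OPER_DATE","OPER_CODE","FILE_CONTENT","FILE_NAME","INPATIENT_NO","ID" FROM HIS_SDZL."MDT_FILE" WHERE $CONDITIONS' \
  --split-by ID \
  --target-dir /tmp/mdt_file \
  --fetch-size 1
```

With rows up to 180 MB, keeping --fetch-size at 1 limits each JDBC round trip to one row while the enlarged heap absorbs the largest single record.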

P.S.: How do you query the size of a BLOB column with a standard SQL statement?
There are several kinds of LOB columns. For a simple BLOB column in Oracle 9i, length should work (lengthb also does); failing that, use dbms_lob.getlength().
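The check above can be sketched concretely, assuming sqlplus access to the source database (the credentials and TNS alias are placeholders; the table and column names come from the error log). For a BLOB, dbms_lob.getlength() returns the size in bytes:

```shell
# Placeholder credentials/alias; HIS_SDZL.MDT_FILE and FILE_CONTENT
# are the table and column identified in the stack trace above.
sqlplus -s his_user/secret@HIS_DB <<'EOF'
-- dbms_lob.getlength() returns bytes for a BLOB (characters for a CLOB).
SELECT MAX(dbms_lob.getlength(FILE_CONTENT)) AS max_bytes
FROM   HIS_SDZL.MDT_FILE;
EOF
```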