Q&A · Help: appcrawler keeps failing with javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’ (i.e. "a location path was expected, but the following token was encountered: ‘行情’") — how can I fix this?

vivi0407 · 2020-04-22 · 1217 reads

Error output:
AutomationSuite:
2020-04-22 00:15:56 INFO [AutomationSuite.13.beforeAll] beforeAll
2020-04-22 00:15:56 INFO [AutomationSuite.21.$anonfun$new$1] testcase start
2020-04-22 00:15:56 INFO [AutomationSuite.28.$anonfun$new$2] Step(null,null,null,跳过,click,null,0)
2020-04-22 00:15:56 INFO [AutomationSuite.31.$anonfun$new$2] 跳过
2020-04-22 00:15:56 INFO [AutomationSuite.32.$anonfun$new$2] click
2020-04-22 00:15:56 INFO [Crawler.996.doElementAction] current element = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
2020-04-22 00:15:56 INFO [Crawler.997.doElementAction] current index = 1
2020-04-22 00:15:56 INFO [Crawler.998.doElementAction] current action = click
2020-04-22 00:15:56 INFO [Crawler.999.doElementAction] current xpath = //*[@resource-id="com.ykkg.lz:id/action_bar_root"]//*[@resource-id="android:id/content"]//*[@resource-id="com.ykkg.lz:id/rl_root"]//*[@text="跳过" and @resource-id="com.ykkg.lz:id/tv_jump"]
2020-04-22 00:15:56 INFO [Crawler.1000.doElementAction] current url = Steps
2020-04-22 00:15:56 INFO [Crawler.1001.doElementAction] current tag path = hierarchy/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.LinearLayout/android.widget.FrameLayout/android.widget.RelativeLayout/android.widget.TextView
2020-04-22 00:15:56 INFO [Crawler.1002.doElementAction] current file name = Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
2020-04-22 00:15:56 INFO [AppCrawler$.59.saveReqHash] save reqHash to 1
2020-04-22 00:15:56 INFO [AppCrawler$.92.saveReqImg] save reqImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png to 1
2020-04-22 00:15:56 INFO [AppCrawler$.76.saveReqDom] save reqDom to 1
2020-04-22 00:15:56 INFO [Crawler.1071.doElementAction] need input click
2020-04-22 00:15:56 INFO [AppiumClient.53.findElementByURI] find by uri element= Steps.tag=TextView.depth=8.id=tv_jump.text=跳过
2020-04-22 00:15:56 INFO [AppiumClient.245.findElementsByURI] findElementByAndroidUIAutomator new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")
2020-04-22 00:15:56 INFO [AppiumClient.60.findElementByURI] find by xpath success
2020-04-22 00:15:56 INFO [Crawler.1080.doElementAction] mark 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
2020-04-22 00:15:56 INFO [AppiumClient.141.mark] read from 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
2020-04-22 00:15:59 INFO [AppiumClient.154.mark] write png 20200422001533/0_SplashActiveActivity.tag=start.id=start.clicked.png
2020-04-22 00:15:59 INFO [AppiumClient.161.mark] ImageIO.write newImageName 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.click.png
2020-04-22 00:16:00 INFO [Crawler.1095.$anonfun$doElementAction$5] click element
2020-04-22 00:16:00 INFO [AppiumClient.174.click] [[io.appium.java_client.android.AndroidDriver, Capabilities: {app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, appium=http://127.0.0.1:4723/wd/hub, databaseEnabled=false, desired={platformName=android, appium=http://127.0.0.1:4723/wd/hub, app=, appActivity=com.dx168.efsmobile.application.SplashActivity, appPackage=com.ykkg.lz, deviceName=demo, fullReset=false, noReset=true}, deviceApiLevel=22, deviceManufacturer=vivo, deviceModel=vivo X7, deviceName=db20dbf7, deviceScreenDensity=480, deviceScreenSize=1080x1920, deviceUDID=db20dbf7, fullReset=false, javascriptEnabled=true, locationContextEnabled=false, networkConnectionEnabled=true, noReset=true, pixelRatio=3, platform=LINUX, platformName=Android, platformVersion=5.1.1, statBarHeight=72, takesScreenshot=true, viewportRect={left=0, top=72, width=1080, height=1848}, warnings={}, webStorageEnabled=false}] -> -android uiautomator: new UiSelector().className("android.widget.TextView").text("跳过").resourceId("com.ykkg.lz:id/tv_jump")]
2020-04-22 00:16:02 INFO [Crawler.1126.doElementAction] mark image exist
2020-04-22 00:16:02 INFO [Crawler.1130.doElementAction] sleep 500 for loading
2020-04-22 00:16:03 INFO [Crawler.627.refreshPage] refresh page
2020-04-22 00:16:03 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2020-04-22 00:16:03 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2020-04-22 00:16:03 INFO [Crawler.645.parsePageContext] appName =
2020-04-22 00:16:03 INFO [Crawler.649.parsePageContext] url=MainActivity
2020-04-22 00:16:04 INFO [Crawler.673.parsePageContext] currentContentHash=150a5c764690a3d0af3e1fffdab3c011 lastContentHash=d10e6132ee8d3b2bd7fa6854da178699
2020-04-22 00:16:04 INFO [Crawler.675.parsePageContext] ui change
2020-04-22 00:16:04 INFO [Crawler.931.saveDom] save to 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.dom
2020-04-22 00:16:04 INFO [Crawler.953.saveScreen] start screenshot
2020-04-22 00:16:04 INFO [Crawler.956.$anonfun$saveScreen$2] ui change screenshot again
2020-04-22 00:16:05 INFO [Crawler.977.saveScreen] screenshot success
2020-04-22 00:16:05 INFO [AppCrawler$.67.saveResHash] save resHash to 1
2020-04-22 00:16:05 INFO [AppCrawler$.101.saveResImg] save resImg 20200422001533/1_Steps.tag=TextView.depth=8.id=tv_jump.text=跳过.clicked.png to 1
2020-04-22 00:16:05 INFO [AppCrawler$.84.saveResDom] save resDom to 1
2020-04-22 00:16:05 INFO [AutomationSuite.66.$anonfun$new$1] finish run steps

- run steps
2020-04-22 00:16:05 INFO [Crawler.627.refreshPage] refresh page
2020-04-22 00:16:05 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2020-04-22 00:16:06 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2020-04-22 00:16:06 INFO [Crawler.645.parsePageContext] appName =
2020-04-22 00:16:06 INFO [Crawler.649.parsePageContext] url=MainActivity
2020-04-22 00:16:06 INFO [Crawler.673.parsePageContext] currentContentHash=3d9930cd9d5328c64b4fef63ed06d2ba lastContentHash=150a5c764690a3d0af3e1fffdab3c011
2020-04-22 00:16:06 INFO [Crawler.675.parsePageContext] ui change
2020-04-22 00:16:06 INFO [Crawler.1213.handleCtrlC] add shutdown hook
2020-04-22 00:16:06 INFO [Crawler.772.crawl] crawl next
2020-04-22 00:16:06 INFO [Crawler.425.needReturn] urlStack=Stack(MainActivity) baseUrl=List() maxDepth=10
2020-04-22 00:16:06 INFO [Crawler.834.crawl] no need to back
2020-04-22 00:16:06 INFO [Crawler.487.getAvailableElement] selected nodes size = 9
2020-04-22 00:16:06 ERROR [Crawler.193.crawl] crawl not finish, return with exception
2020-04-22 00:16:06 ERROR [Crawler.194.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
2020-04-22 00:16:06 ERROR [Crawler.195.crawl] TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
2020-04-22 00:16:06 ERROR [Crawler.196.crawl] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] [wrapped] javax.xml.xpath.XPathExpressionException: javax.xml.transform.TransformerException: 需要位置路径, 但遇到以下标记: ‘行情’
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.error(XPathParser.java:612)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1603)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:896)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:842)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PredicateExpr(XPathParser.java:1956)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Predicate(XPathParser.java:1938)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Step(XPathParser.java:1728)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelativeLocationPath(XPathParser.java:1628)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.LocationPath(XPathParser.java:1599)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.PathExpr(XPathParser.java:1319)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnionExpr(XPathParser.java:1238)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.UnaryExpr(XPathParser.java:1144)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.MultiplicativeExpr(XPathParser.java:1065)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AdditiveExpr(XPathParser.java:1007)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.RelationalExpr(XPathParser.java:932)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.EqualityExpr(XPathParser.java:872)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.AndExpr(XPathParser.java:836)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.OrExpr(XPathParser.java:809)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.Expr(XPathParser.java:792)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.compiler.XPathParser.initXPath(XPathParser.java:131)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.&lt;init&gt;(XPath.java:180)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.XPath.&lt;init&gt;(XPath.java:268)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.sun.org.apache.xpath.internal.jaxp.XPathImpl.compile(XPathImpl.java:390)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXML(XPathUtil.scala:167)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListFromXPath(XPathUtil.scala:183)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.XPathUtil$.getNodeListByKey(XPathUtil.scala:271)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4(Crawler.scala:493)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$getAvailableElement$4$adapted(Crawler.scala:491)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.immutable.List.foreach(List.scala:389)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach(TraversableForwarder.scala:35)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.generic.TraversableForwarder.foreach$(TraversableForwarder.scala:35)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.collection.mutable.ListBuffer.foreach(ListBuffer.scala:44)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.getAvailableElement(Crawler.scala:491)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:840)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$crawl$1(Crawler.scala:187)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at scala.util.Try$.apply(Try.scala:209)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:187)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.start(Crawler.scala:170)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.startCrawl(AppCrawler.scala:322)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.parseParams(AppCrawler.scala:290)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.main(AppCrawler.scala:91)
2020-04-22 00:16:06 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler.main(AppCrawler.scala)
2020-04-22 00:16:06 ERROR [Crawler.198.crawl] create new session
2020-04-22 00:16:06 INFO [Crawler.214.restart] execute shell on restart
2020-04-22 00:16:06 INFO [Crawler.217.restart] restart appium
2020-04-22 00:16:06 INFO [Crawler.250.setupAppium] afterPageMax=2
2020-04-22 00:16:06 INFO [Crawler.273.setupAppium] use AppiumClient

I did add “行情” to the blacklist; I'm not sure whether that is what's causing this.
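The stack trace runs through com.testerhome.appcrawler.XPathUtil$.getNodeListByKey and then into the JDK XPath compiler, which suggests the blacklist entry ends up inside an XPath expression at some point. One way to narrow this down, outside appcrawler, is to check which of the blacklist strings the JDK XPath parser will accept at all. The sketch below does only that; BlacklistXPathCheck and the candidate list are made up for illustration, and how a given appcrawler version turns a non-xpath blacklist entry into an expression may differ.

```scala
import javax.xml.xpath.{XPathExpressionException, XPathFactory}

// Hypothetical stand-alone check, not part of appcrawler: try to compile each
// candidate blacklist entry as an XPath expression and report the ones the
// parser rejects with the same XPathExpressionException seen in the log.
object BlacklistXPathCheck {
  def main(args: Array[String]): Unit = {
    // Build a JAXP XPath instance, the same API the stack trace ends up in.
    val xpath = XPathFactory.newInstance().newXPath()

    // A plain-text entry versus an explicit location path targeting the same text.
    // Both strings are illustrative; substitute whatever is actually in the config.
    val candidates = Seq("行情", """//*[contains(@text, "行情")]""")

    candidates.foreach { entry =>
      try {
        xpath.compile(entry)
        println(s"compiles as XPath: $entry")
      } catch {
        case e: XPathExpressionException =>
          // Same exception type as in the appcrawler log above.
          println(s"rejected by the XPath parser: $entry -> ${e.getMessage}")
      }
    }
  }
}
```

If the plain-text form is the one being rejected, rewriting the entry as an explicit xpath such as //*[contains(@text, "行情")] (or whatever pattern format the appcrawler version in use documents for blackList) would be the first thing to try.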

No replies yet.