java -jar ../COMMON/appcrawler-2.4.0.jar -u http://localhost:4623/wd/hub -p android -o androidlog/ -c ../COMMON/configandroid.yml --capability appPackage=com.jingdong.app.mall,appActivity=.main.MainActivity
[ appcrawler traverse app ]: check appcrawler, for 0 s
AppCrawler 2.4.0 [霍格沃兹测试学院特别纪念版]
2019-10-17 14:52:49 INFO [AppCrawler$.203.parseParams] Find Conf /st/exe-auto/AutoProtocol/nfq/../COMMON/configandroid.yml
2019-10-17 14:52:49 INFO [AppCrawler$.230.parseParams] appium address = Some(http://localhost:4623/wd/hub)
2019-10-17 14:52:49 INFO [AppCrawler$.242.parseParams] result directory = androidlog/
2019-10-17 14:52:50 INFO [Crawler.130.start] set xpath attribute with List(name, label, value, resource-id, content-desc, instance, text)
2019-10-17 14:52:50 INFO [Crawler.135.start] set xpath
2019-10-17 14:52:50 INFO [Crawler.89.$anonfun$loadPlugins$2] com.testerhome.appcrawler.plugin.TagLimitPlugin@2fcd7d3f
2019-10-17 14:52:50 INFO [Crawler.89.$anonfun$loadPlugins$2] com.testerhome.appcrawler.plugin.ReportPlugin@6ae62c7e
2019-10-17 14:52:50 INFO [Crawler.89.$anonfun$loadPlugins$2] com.testerhome.appcrawler.plugin.FreeMind@37c36608
2019-10-17 14:52:50 INFO [TagLimitPlugin.19.init] com.testerhome.appcrawler.plugin.TagLimitPlugin init
2019-10-17 14:52:50 INFO [ReportPlugin.19.init] com.testerhome.appcrawler.plugin.ReportPlugin init
2019-10-17 14:52:50 INFO [FreeMind.19.init] com.testerhome.appcrawler.plugin.FreeMind init
2019-10-17 14:52:50 INFO [ReportPlugin.21.start] reportPath=/st/exe-auto/AutoProtocol/nfq/androidlog
2019-10-17 14:52:50 INFO [ReportPlugin.24.start] create /st/exe-auto/AutoProtocol/nfq/androidlog/tmp/ directory
2019-10-17 14:52:50 INFO [Crawler.138.start] prepare setup Appium
2019-10-17 14:52:50 INFO [Crawler.250.setupAppium] afterPageMax=2
2019-10-17 14:52:50 INFO [Crawler.273.setupAppium] use AppiumClient
2019-10-17 14:52:50 INFO [Crawler.274.setupAppium] Map(newCommandTimeout -> 120, appActivity -> .main.MainActivity, launchTimeout -> 120000, appium -> http://localhost:4623/wd/hub, noReset -> true, dontStopAppOnReset -> true, appPackage -> com.jingdong.app.mall)
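The capability map logged above is the merge of the capability section in ../COMMON/configandroid.yml with the --capability overrides from the command line. A minimal sketch of what that part of the YAML might look like, reconstructed only from the values this run logged (the real file is not shown in this log):

    # Hypothetical excerpt of configandroid.yml, reconstructed from the map
    # logged by setupAppium above; verify key placement against the real file.
    capability:
      appium: "http://localhost:4623/wd/hub"
      appPackage: "com.jingdong.app.mall"   # also passed via --capability on the CLI
      appActivity: ".main.MainActivity"     # also passed via --capability on the CLI
      noReset: true
      dontStopAppOnReset: true
      newCommandTimeout: 120
      launchTimeout: 120000

Keys and values mirror the logged map one-to-one; only their grouping under capability is assumed.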
Oct 17, 2019 2:53:04 PM io.appium.java_client.remote.AppiumCommandExecutor$1 lambda$0
INFO: Detected dialect: W3C
2019-10-17 14:53:04 INFO [AppiumClient.120.getDeviceInfo] screenWidth=1080 screenHeight=1920
2019-10-17 14:53:04 INFO [AppiumClient.112.appium] capture dir = /st/exe-auto/AutoProtocol/nfq/.
2019-10-17 14:53:04 INFO [Crawler.278.setupAppium] com.testerhome.appcrawler.driver.AppiumClient@5bda157e
2019-10-17 14:53:04 INFO [Crawler.145.start] platformName= driver=com.testerhome.appcrawler.driver.AppiumClient@5bda157e
AppCrawler 2.4.0 [霍格沃兹测试学院特别纪念版]
2019-10-17 14:53:04 INFO [Crawler.147.start] waiting for app load
2019-10-17 14:53:10 INFO [Crawler.149.start] driver=null
2019-10-17 14:53:10 INFO [Crawler.150.start] get screen info
2019-10-17 14:53:10 INFO [AppiumClient.120.getDeviceInfo] screenWidth=1080 screenHeight=1920
2019-10-17 14:53:10 INFO [Crawler.627.refreshPage] refresh page
2019-10-17 14:53:10 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2019-10-17 14:53:12 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:53:13 INFO [Crawler.645.parsePageContext] appName =
2019-10-17 14:53:13 INFO [Crawler.649.parsePageContext] url=MainFrameActivity
2019-10-17 14:53:13 INFO [Crawler.673.parsePageContext] currentContentHash=c939964d57a4b3f5e2578556b7c7e1d1 lastContentHash=c939964d57a4b3f5e2578556b7c7e1d1
2019-10-17 14:53:13 INFO [DataRecord.24.isDiff] just only record return false
2019-10-17 14:53:13 INFO [Crawler.677.parsePageContext] ui not change
2019-10-17 14:53:13 INFO [Crawler.230.firstRefresh] first refresh
2019-10-17 14:53:13 INFO [Crawler.996.doElementAction] current element = MainFrameActivity.tag=start.id=start
2019-10-17 14:53:13 INFO [Crawler.997.doElementAction] current index = 0
2019-10-17 14:53:13 INFO [Crawler.998.doElementAction] current action =
2019-10-17 14:53:13 INFO [Crawler.999.doElementAction] current xpath = Start-Start-0
2019-10-17 14:53:13 INFO [Crawler.1000.doElementAction] current url = MainFrameActivity
2019-10-17 14:53:13 INFO [Crawler.1001.doElementAction] current tag path =
2019-10-17 14:53:13 INFO [Crawler.1002.doElementAction] current file name = MainFrameActivity.tag=start.id=start
2019-10-17 14:53:13 INFO [AppCrawler$.59.saveReqHash] save reqHash to 0
2019-10-17 14:53:13 INFO [AppCrawler$.92.saveReqImg] save reqImg androidlog//0_MainFrameActivity.tag=start.id=start.click.png to 0
2019-10-17 14:53:13 INFO [AppCrawler$.76.saveReqDom] save reqDom to 0
2019-10-17 14:53:13 INFO [Crawler.1014.doElementAction] just log
2019-10-17 14:53:13 INFO [Crawler.1015.doElementAction] {
"url" : "MainFrameActivity",
"tag" : "start",
"id" : "start",
"name" : "",
"text" : "",
"instance" : "",
"depth" : "",
"valid" : "true",
"selected" : "false",
"xpath" : "Start-Start-0",
"ancestor" : "",
"x" : 0,
"y" : 0,
"width" : 0,
"height" : 0
}
2019-10-17 14:53:13 INFO [Crawler.1126.doElementAction] mark image exist
2019-10-17 14:53:13 INFO [Crawler.1130.doElementAction] sleep 500 for loading
2019-10-17 14:53:13 INFO [Crawler.627.refreshPage] refresh page
2019-10-17 14:53:13 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2019-10-17 14:53:25 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:53:25 INFO [Crawler.645.parsePageContext] appName =
2019-10-17 14:53:25 INFO [Crawler.649.parsePageContext] url=MainFrameActivity
2019-10-17 14:53:25 INFO [Crawler.673.parsePageContext] currentContentHash=b8f9645a95f33f35014a4ab3ab0e2fb1 lastContentHash=c939964d57a4b3f5e2578556b7c7e1d1
2019-10-17 14:53:25 INFO [Crawler.675.parsePageContext] ui change
2019-10-17 14:53:25 INFO [Crawler.931.saveDom] save to androidlog//0_MainFrameActivity.tag=start.id=start.dom
2019-10-17 14:53:25 INFO [Crawler.953.saveScreen] start screenshot
2019-10-17 14:53:25 INFO [Crawler.956.$anonfun$saveScreen$2] ui change screenshot again
2019-10-17 14:53:27 INFO [Crawler.977.saveScreen] screenshot success
2019-10-17 14:53:27 INFO [AppCrawler$.67.saveResHash] save resHash to 0
2019-10-17 14:53:27 INFO [AppCrawler$.101.saveResImg] save resImg androidlog//0_MainFrameActivity.tag=start.id=start.clicked.png to 0
2019-10-17 14:53:27 INFO [AppCrawler$.84.saveResDom] save resDom to 0
2019-10-17 14:53:27 INFO [Crawler.159.start] append current app name to appWhiteList
2019-10-17 14:53:27 INFO [Crawler.163.start] run steps
2019-10-17 14:53:27 INFO [Crawler.237.runSteps] run testcases
AutomationSuite:
2019-10-17 14:53:27 INFO [AutomationSuite.13.beforeAll] beforeAll
2019-10-17 14:53:27 INFO [AutomationSuite.21.$anonfun$new$1] testcase start
2019-10-17 14:53:27 INFO [AutomationSuite.28.$anonfun$new$2] Step(List(),null,List(),/,Thread.sleep(5000),List(),0)
2019-10-17 14:53:27 INFO [AutomationSuite.31.$anonfun$new$2] /
2019-10-17 14:53:27 INFO [AutomationSuite.32.$anonfun$new$2] Thread.sleep(5000)
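The Step echoed by AutomationSuite matches the whole page (xpath /) and simply sleeps for five seconds before the crawl proper starts. Assuming the standard AppCrawler testcase layout, the YAML that produces this step would look roughly like:

    # Hypothetical testcase section of configandroid.yml; only the fields the
    # log echoes (xpath and action) are filled in, everything else is omitted.
    testcase:
      steps:
        - xpath: /
          action: Thread.sleep(5000)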
2019-10-17 14:53:27 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2019-10-17 14:53:39 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:53:39 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
[ appcrawler traverse app ]: appcrawler is still running
[ appcrawler traverse app ]: check appcrawler, for 60 s
2019-10-17 14:53:50 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:53:50 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2019-10-17 14:54:02 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:54:02 INFO [Crawler.996.doElementAction] current element = Steps.tag=.name=NOT_FOUND
2019-10-17 14:54:02 INFO [Crawler.997.doElementAction] current index = 1
2019-10-17 14:54:02 INFO [Crawler.998.doElementAction] current action =
2019-10-17 14:54:02 INFO [Crawler.999.doElementAction] current xpath = /*
2019-10-17 14:54:02 INFO [Crawler.1000.doElementAction] current url = Steps
2019-10-17 14:54:02 INFO [Crawler.1001.doElementAction] current tag path =
2019-10-17 14:54:02 INFO [Crawler.1002.doElementAction] current file name = Steps.tag=.name=NOT_FOUND
2019-10-17 14:54:02 INFO [AppCrawler$.59.saveReqHash] save reqHash to 1
2019-10-17 14:54:02 INFO [AppCrawler$.92.saveReqImg] save reqImg androidlog//1_Steps.tag=.name=NOT_FOUND.click.png to 1
2019-10-17 14:54:02 INFO [AppCrawler$.76.saveReqDom] save reqDom to 1
2019-10-17 14:54:02 INFO [Crawler.1014.doElementAction] just log
2019-10-17 14:54:02 INFO [Crawler.1015.doElementAction] {
"url" : "Steps",
"tag" : "",
"id" : "",
"name" : "NOT_FOUND",
"text" : "",
"instance" : "",
"depth" : "",
"valid" : "true",
"selected" : "false",
"xpath" : "/",
"ancestor" : "",
"x" : 0,
"y" : 0,
"width" : 0,
"height" : 0
}
2019-10-17 14:54:02 INFO [Crawler.1123.doElementAction] use last clicked image replace mark
2019-10-17 14:54:02 INFO [Crawler.1130.doElementAction] sleep 500 for loading
2019-10-17 14:54:02 INFO [Crawler.627.refreshPage] refresh page
2019-10-17 14:54:02 INFO [AppiumClient.102.getPageSourceWithRetry] start to get page source from appium
2019-10-17 14:54:14 INFO [AppiumClient.117.$anonfun$getPageSourceWithRetry$1] xml format
2019-10-17 14:54:14 INFO [Crawler.645.parsePageContext] appName =
2019-10-17 14:54:14 INFO [Crawler.649.parsePageContext] url=MainFrameActivity
2019-10-17 14:54:14 INFO [Crawler.673.parsePageContext] currentContentHash=ea6310c34dfcd586baa5bd6664beba77 lastContentHash=b8f9645a95f33f35014a4ab3ab0e2fb1
2019-10-17 14:54:14 INFO [Crawler.675.parsePageContext] ui change
2019-10-17 14:54:14 INFO [Crawler.931.saveDom] save to androidlog//1_Steps.tag=.name=NOT_FOUND.dom
2019-10-17 14:54:14 INFO [Crawler.953.saveScreen] start screenshot
2019-10-17 14:54:14 INFO [Crawler.956.$anonfun$saveScreen$2] ui change screenshot again
2019-10-17 14:54:16 INFO [Crawler.977.saveScreen] screenshot success
2019-10-17 14:54:16 INFO [AppCrawler$.67.saveResHash] save resHash to 1
2019-10-17 14:54:16 INFO [AppCrawler$.101.saveResImg] save resImg androidlog//1_Steps.tag=.name=NOT_FOUND.clicked.png to 1
2019-10-17 14:54:16 INFO [AppCrawler$.84.saveResDom] save resDom to 1
2019-10-17 14:54:16 INFO [AutomationSuite.44.$anonfun$new$3] /
crawl next
2019-10-17 14:54:27 INFO [Crawler.425.needReturn] urlStack=Stack(MainFrameActivity) baseUrl=List() maxDepth=5
2019-10-17 14:54:27 INFO [Crawler.834.crawl] no need to back
2019-10-17 14:54:27 INFO [Crawler.487.getAvailableElement] selected nodes size = 23
2019-10-17 14:54:27 ERROR [Crawler.193.crawl] crawl not finish, return with exception
2019-10-17 14:54:27 ERROR [Crawler.194.crawl]
2019-10-17 14:54:27 ERROR [Crawler.195.crawl] NullPointerException:
2019-10-17 14:54:27 ERROR [Crawler.196.crawl]
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] java.lang.NullPointerException
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.getAvailableElement(Crawler.scala:491)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:840)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.$anonfun$crawl$1(Crawler.scala:187)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:12)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at scala.util.Try$.apply(Try.scala:209)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.crawl(Crawler.scala:187)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.Crawler.start(Crawler.scala:170)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.startCrawl(AppCrawler.scala:322)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.parseParams(AppCrawler.scala:290)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler$.main(AppCrawler.scala:91)
2019-10-17 14:54:27 ERROR [Crawler.197.$anonfun$crawl$2] at com.testerhome.appcrawler.AppCrawler.main(AppCrawler.scala)
2019-10-17 14:54:27 ERROR [Crawler.198.crawl] create new session
2019-10-17 14:54:27 INFO [Crawler.214.restart] execute shell on restart
2019-10-17 14:54:27 INFO [Crawler.217.restart] restart appium
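The crawl aborts with a NullPointerException inside getAvailableElement right after 23 nodes were selected, and AppCrawler recovers by creating a new session and restarting Appium. For reference, the crawl-scope values this run logged elsewhere (maxDepth=5 in needReturn, and the current package being appended to appWhiteList at startup) would correspond to config entries roughly like the sketch below. Both names appear verbatim in the log; treating them as top-level keys in configandroid.yml is an assumption about how the file is laid out.

    # Hypothetical crawl-scope excerpt of configandroid.yml, reconstructed from
    # values logged during this run; the crawler appends the current package to
    # appWhiteList automatically at startup, so this entry mainly documents intent.
    maxDepth: 5
    appWhiteList:
      - com.jingdong.app.mall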