Debug embedded microblaze using XVC JTAG in AWS FPGA shell #641

Open
augierg opened this issue Apr 30, 2024 · 12 comments
@augierg

augierg commented Apr 30, 2024

Is there any undocumented flow for embedded FW development using a MicroBlaze inside the CL, with the AWS-FPGA HDK?

In my on-premise environment, using a U200 card and following the instructions from aws-fpga-f1-u200/Virtual_JTAG_XVC.md, I am able to:

  1. launch the XVC PCIe driver on the host with the U200 card (via xvc_pcie -s TCP::10201):
Description:
Xilinx xvc_pcie v2018.3
Build date : Apr 25 2024-12:18:59
Copyright 1986-2018 Xilinx, Inc. All Rights Reserved.

INFO: XVC PCIe Driver character file - /dev/xil_xvc/cfg_ioc0
INFO: XVC PCIe Driver configured to communicate with Debug Bridge IP in AXI mode (PCIe BAR space).
INFO: PCIe BAR index=0x0002 and PCIe BAR offset=0x0000
INFO: XVC PCIE Driver Loopback test successful.

INFO: xvc_pcie application started
INFO: Use Ctrl-C to exit xvc_pcie application

INFO: To connect to this xvc_pcie instance use url: TCP:fpga:10201

INFO: xvcserver accepted connection from client 192.168.8.104:43220 
  2. then connect to the MDM through the XVC virtual cable and control the MicroBlaze from the XSDB console:
xsdb% connect -xvc-url tcp:fpga:10201                                                                                                                                               
tcfchan#1
xsdb% targets                                                                                                                                                                       
  1  debug_bridge
     2  Legacy Debug Hub
     3  Legacy Debug Hub
     4  MicroBlaze Debug Module at USER1.2.2
        5  MicroBlaze #0 (Running)
xsdb% jtag servers                                                                                                                                                                  
  digilent-ftdi cables 0
  xilinx-ftdi cables 0
  digilent-djtg cables 0
  bscan-jtag cables 0
  xilinx-xvc:fpga:10201 cables 1
xsdb% jtag targets                                                                                                                                                                  
  1  Xilinx Virtual Cable fpga:10201
     2  debug_bridge (idcode 0a003093 irlen 6 fpga)
        3  bscan-switch (idcode 04900102 irlen 1 fpga)
           4  debug-hub (idcode 04900220 irlen 1 fpga)
           5  bscan-switch (idcode 04900102 irlen 1 fpga)
              6  debug-hub (idcode 04900220 irlen 1 fpga)
              7  mdm (idcode 04900500 irlen 1 fpga)
xsdb% targets                                                                                                                                                                       
  1  debug_bridge
     2  Legacy Debug Hub
     3  Legacy Debug Hub
     4  MicroBlaze Debug Module at USER1.2.2
        5  MicroBlaze #0 (Running)
xsdb% target 5                                                                                                                                                                      
xsdb% targets                                                                                                                                                                       
  1  debug_bridge
     2  Legacy Debug Hub
     3  Legacy Debug Hub
     4  MicroBlaze Debug Module at USER1.2.2
        5* MicroBlaze #0 (Running)
xsdb% rst                                                                                                                                                                           
xsdb% Info: MicroBlaze #0 (target 5) Stopped at 0x0 (External debug request)
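
From there the core can be controlled as usual; a minimal sketch of the typical next steps in XSDB (the ELF name is hypothetical):

xsdb% dow app.elf        # download the application ELF to the MicroBlaze
xsdb% bpadd -addr &main  # set a breakpoint at main
xsdb% con                # resume execution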

However, when porting the exact same CL to the AWS F1, using the instructions from aws-fpga/Virtual_JTAG_XVC.md:

ubuntu@ip-172-31-41-160:~$ sudo /usr/local/bin/fpga-start-virtual-jtag -S 0
Starting Virtual JTAG XVC Server for FPGA slot id 0, listening to TCP port 10201.
Press CTRL-C to stop the service.
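
The client side then connects the same way as on-prem, e.g. using the instance's public hostname:

xsdb% connect -xvc-url tcp:ec2-52-34-30-133.us-west-2.compute.amazonaws.com:10201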

The XVC server fails to identify the valid targets, including the MDM and the MicroBlaze, as shown below:

xsdb% jtag servers         
  digilent-ftdi cables 0                                                                                                                                      
  xilinx-ftdi cables 0
  digilent-djtg cables 0
  bscan-jtag cables 0
  xilinx-xvc:ec2-52-34-30-133.us-west-2.compute.amazonaws.com:10201 cables 1
xsdb% jtag targets                                                                                                                                            
  8  Xilinx Virtual Cable ec2-52-34-30-133.us-west-2.compute.amazonaws.com:10201                                                                              
     9  debug_bridge (idcode 0a003093 irlen 6 fpga)
       10  bscan-switch (idcode 04900102 irlen 1 fpga)
          11  unknown (idcode 09200204 irlen 1 fpga)
          12  unknown (idcode 09200440 irlen 1 fpga)
xsdb% targets                                                                                                                                                 
  1  debug_bridge                                                                                                                                             
     2  09200204
     3  09200440
xsdb%                  

I even tried using the Xilinx XVC PCIe driver, as in the on-premise U200 flow, but that leads to errors indicating an incompatibility with the PCIe BAR space:

ubuntu@ip-172-31-41-160:~$ sudo /home/ubuntu/xvc/xvcserver/bin/xvc_pcie -s TCP::10201

Description:
Xilinx xvc_pcie v2018.3
Build date : Apr 25 2024-12:18:59
Copyright 1986-2018 Xilinx, Inc. All Rights Reserved.

INFO: XVC PCIe Driver character file - /dev/xil_xvc/cfg_ioc0
INFO: XVC PCIe Driver configured to communicate with Debug Bridge IP in AXI mode (PCIe BAR space).
INFO: PCIe BAR index=0x0002 and PCIe BAR offset=0x0000
Loopback test length: 32, pattern abcdefgHIJKLMOP FAILURE
	Byte 0 did not match (0x61 != 0x01 mask 0xFF), pattern abcdefgHIJKLMOP
ERROR: XVC PCIE Driver Loopback test failed. Error: Success
Exiting xvc_pcie application.

Help with the suggested flow for debugging embedded FW using the XVC virtual JTAG cable on an F1 instance would be appreciated at this point.

@czfpga
Contributor

czfpga commented May 6, 2024

Hi,

Thank you for reaching out. We're currently investigating this issue with AMD. We'll keep you updated.

@s03311251

I hope this is the right place to post, as I am experiencing a similar problem:

Right now I'm bringing up a MicroBlaze core using the Vivado IP Integrator flow; however, I can't connect to the MicroBlaze Debug Module (MDM) when I deploy it on an EC2 F1 instance.

My design is as follows:
[screenshot: CL block diagram]
design files: aws_mb_example.zip

I have followed #507, which mentioned using "EXTERNAL HIDDEN" for BSCAN in the MDM:
[screenshot: MDM settings]
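
For reference, the equivalent setting from the Tcl console would be something like the line below (mdm_1 is my assumed cell name, and the value that maps to "EXTERNAL HIDDEN" should be verified against the MDM product guide for your Vivado version):

# assumed mapping of "EXTERNAL HIDDEN" for the MDM BSCAN parameter
set_property CONFIG.C_USE_BSCAN {2} [get_bd_cells mdm_1]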

However, it seems to fail to identify the MDM; when I connect with XSCT, I get the following:
[screenshot: SSH, Vivado Tcl shell, and XSCT sessions]
Top left is the SSH session on the EC2 F1 instance, running hw_server and sudo fpga-start-virtual-jtag -P 10201 -S 0.
Bottom left is the Vivado Tcl shell, connected to the Virtual JTAG on the F1.
Bottom right is XSCT, also connected to the Virtual JTAG on the F1.
The commands are run in this order: SSH > Vivado > XSCT.

The targets command in XSCT gives 8-digit numbers only, but it should supposedly give something like "MicroBlaze Debug Module".

May I ask if there is any problem with my MicroBlaze design? Is there an alternative method to connect to the MDM, or an example MicroBlaze design that works on AWS EC2?

Thank you.

@jameslxilinx

I imagine the first use case is using the HDK flow and the second the HLx flow. In both cases we need to make sure the debug_hub and MDM are connected properly.

Doing a first pass in 2021.2 with the HLx flow and an MDM/MicroBlaze similar to the test case, I noticed that the connections to the MDM were not correct. Can the post-opt checkpoints of both designs be opened and the following connections verified (first snapshot)? Please either post the DCPs or snapshots of the post-opt connections like the ones below (note: the no-connects make sense in this capture, and the debug icon is for an ILA test I was doing).

[screenshot: MDM connections in the post-opt design]

Below, make sure the debug bridge connections go to the shell (second snapshot).

[screenshot: debug bridge connections to the shell]
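
If it's easier than screenshots, a scripted check along these lines should surface the same information (the checkpoint file name and the *mdm* search pattern are examples; adjust to your design):

open_checkpoint cl.post_opt.dcp
# locate the MDM instance and list the nets on its pins to verify the BSCAN hookup
set mdm_cells [get_cells -hierarchical -filter {NAME =~ "*mdm*"}]
get_nets -of_objects [get_pins -of_objects $mdm_cells]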

@augierg
Author

augierg commented May 24, 2024

@jameslxilinx

In response to your inquiry, I confirm that in my case (the first one), the mdm1 in the CL is connected to the boundary scan MUX located in the STATIC_SH logic, per the attached snapshot:

[screenshot: mdm1 connected to the boundary scan MUX in STATIC_SH]

@jameslxilinx

@augierg, the above looks like the U200 shell. I would expect the name to say "hidden" or something like the path below. Can you confirm this is the F1 design?

static_sh/SH_DEBUG_BRIDGE/inst/bsip/inst/USE_SOFTBSCAN.U_BSCAN_TAP
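
One quick way to check, with the post-opt checkpoint open, is to search for that instance by name; if nothing is returned, the design was likely built against a different shell:

get_cells -hierarchical -filter {NAME =~ "*USE_SOFTBSCAN.U_BSCAN_TAP*"}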

@augierg
Author

augierg commented May 28, 2024

@jameslxilinx, you are correct; this was generated from our on-prem aws-f1-u200 flow. I'm going to provide you the equivalent for the actual F1 shell.

@augierg
Author

augierg commented May 29, 2024

@jameslxilinx, I was able to capture a screenshot. It wasn't really obvious, as everything in the hierarchy shows as hidden until you descend into the right subtree, where the CL instance is located:

[screenshot: post-opt hierarchy of the F1 shell]

@jameslxilinx

While I look at the diagram: it looks like the lab tools are 2018.3. What version of the developer kit (Developer AMI) and which version of the Vivado tools are being used?

@augierg
Copy link
Author

augierg commented May 30, 2024

> While I look at the diagram, looks like lab tools is 2018.3. What version of the developer kit (Developer AMI) and version of vivado tools are being used?

Vivado 2021.2, the most recent supported version in current master, and the head of the master branch of the HDK repo.

The shell version downloaded in hdk/common is shell_v04261818.

@augierg
Author

augierg commented Jun 10, 2024

@jameslxilinx: is there anything else I can provide to help with this?

@jameslxilinx

Working on reproducing the issue in F1.

@augierg
Author

augierg commented Sep 13, 2024

@jameslxilinx: any luck reproducing this issue at your end?
