svn commit: r352441 - stable/12/sys/arm64/arm64
Andrew Turner
andrew at FreeBSD.org
Tue Sep 17 10:10:00 UTC 2019
Author: andrew
Date: Tue Sep 17 10:09:59 2019
New Revision: 352441
URL: https://svnweb.freebsd.org/changeset/base/352441
Log:
MFC r343042, r343875
r343042:
Ensure the I-Cache is correctly handled in arm64_icache_sync_range
The cache_handle_range macro, which implements the arm64 instruction- and
data-cache operations, returned directly when it was complete. This caused
problems for arm64_icache_sync_range and arm64_icache_sync_range_checked,
which assume they can execute the i-cache handling instruction after the
macro has run.
Fix this by making this assumption correct.
While here add missing instruction barriers and adjust the style to
match the rest of the assembly.
Sponsored by: DARPA, AFRL
Differential Revision: https://reviews.freebsd.org/D18838
r343875:
Add a missing data barrier to the start of arm64_tlb_flushID.
We need to ensure the page table store has happened before the tlbi.
Reported by: jchandra
Tested by: jchandra
Sponsored by: DARPA, AFRL
Differential Revision: https://reviews.freebsd.org/D19097
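The ordering requirement follows from the Arm memory model: without a barrier, the tlbi may be issued before the page-table store is visible, and a walker can then re-fetch the stale entry. A pseudocode-style sketch of the required sequence (register names illustrative, not from the source):

```
	str	x1, [x0]	// store the new page-table entry
	dsb	ishst		// order the store before the invalidate
	tlbi	vmalle1is	// invalidate EL1 TLB entries, inner shareable
	dsb	ish		// wait for the invalidate to complete
	isb			// resynchronise the local instruction stream
```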
Modified:
stable/12/sys/arm64/arm64/cpufunc_asm.S
Directory Properties:
stable/12/ (props changed)
Modified: stable/12/sys/arm64/arm64/cpufunc_asm.S
==============================================================================
--- stable/12/sys/arm64/arm64/cpufunc_asm.S Tue Sep 17 10:00:53 2019 (r352440)
+++ stable/12/sys/arm64/arm64/cpufunc_asm.S Tue Sep 17 10:09:59 2019 (r352441)
@@ -73,7 +73,6 @@ __FBSDID("$FreeBSD$");
.if \ic != 0
isb
.endif
- ret
.endm
ENTRY(arm64_nullop)
@@ -93,6 +92,7 @@ ENTRY(arm64_setttb)
END(arm64_setttb)
ENTRY(arm64_tlb_flushID)
+ dsb ishst
#ifdef SMP
tlbi vmalle1is
#else
@@ -108,6 +108,7 @@ END(arm64_tlb_flushID)
*/
ENTRY(arm64_dcache_wb_range)
cache_handle_range dcop = cvac
+ ret
END(arm64_dcache_wb_range)
/*
@@ -115,6 +116,7 @@ END(arm64_dcache_wb_range)
*/
ENTRY(arm64_dcache_wbinv_range)
cache_handle_range dcop = civac
+ ret
END(arm64_dcache_wbinv_range)
/*
@@ -125,6 +127,7 @@ END(arm64_dcache_wbinv_range)
*/
ENTRY(arm64_dcache_inv_range)
cache_handle_range dcop = ivac
+ ret
END(arm64_dcache_inv_range)
/*
@@ -132,6 +135,7 @@ END(arm64_dcache_inv_range)
*/
ENTRY(arm64_idcache_wbinv_range)
cache_handle_range dcop = civac, ic = 1, icop = ivau
+ ret
END(arm64_idcache_wbinv_range)
/*
@@ -146,4 +150,6 @@ ENTRY(arm64_icache_sync_range)
cache_handle_range dcop = cvau
ic ialluis
dsb ish
+ isb
+ ret
END(arm64_icache_sync_range)