Choosing random data with Pandas

I’m a beginner in Pandas. I have a data file containing information on 10000 users, with 5 columns and 10000 rows. One of these columns is the district of the users, and it divides users according to where they live (it defines just 7 different locations, and some number of users live in each). As an example, out of these 10000 users, 300 live in the USA, 250 live in Canada, and so on.
I want to define a DataFrame that includes random rows of users from five districts: USA, Canada, LA, NY and Japan. Also, the dimensions need to be 20×5. Can you please help me with how to do that?
I know that for choosing randomly I need to use
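A minimal sketch of one way to do this, assuming the district column is named "district" and that 4 rows are drawn from each of the five districts to reach the stated 20×5 shape (the DataFrame below is a hypothetical stand-in for the real file):

```python
import pandas as pd
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real data file: 5 columns,
# one of which ("district") holds each user's location.
districts = ["USA", "Canada", "LA", "NY", "Japan", "Germany", "France"]
df = pd.DataFrame({
    "user_id": range(1000),
    "name": [f"user{i}" for i in range(1000)],
    "age": rng.integers(18, 80, 1000),
    "score": rng.random(1000),
    "district": rng.choice(districts, 1000),
})

wanted = ["USA", "Canada", "LA", "NY", "Japan"]
# Keep only the five districts of interest, then draw 4 random rows
# from each group: 5 districts x 4 rows = 20 rows, 5 columns.
sample = (
    df[df["district"].isin(wanted)]
    .groupby("district")
    .sample(n=4, random_state=0)
)
print(sample.shape)  # (20, 5)
```

`DataFrame.groupby(...).sample` (available since pandas 1.1) draws the random rows per group in one call; for older pandas versions, `groupby(...).apply(lambda g: g.sample(4))` achieves the same effect.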

DataFrame column: finding (cumulative) local maxima

In the dataframe below, the column “CumRetperTrade” consists of a few vertical vectors (sequences of numbers) separated by zeros (these vectors correspond to the non-zero elements of column “Portfolio”). I would like to find the cumulative local maxima of every non-zero vector contained in column “CumRetperTrade”.
To be precise, I would like to transform (using vectorized, or other, methods) column “CumRetperTrade” into the column “PeakCumRet” (the desired result), which gives, for every vector (i.e. each subset corresponding to ‘Portfolio = 1’) contained in column “CumRetperTrade”, the cumulative maximum of all its previous values. The numeric example is below. Thanks in advance!
PS In other words, I guess that we need to use cummax(), but apply it only to the consecutive (where ‘Portfolio’ = 1) subsets of ‘CumRetperTrade’.
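A sketch of the cummax-per-run idea, assuming the column names from the question and hypothetical example data: label each consecutive run of identical “Portfolio” values, apply `cummax()` within each run, and zero out the rows outside `Portfolio == 1`.

```python
import pandas as pd

# Hypothetical example data: "Portfolio" flags the non-zero runs and
# "CumRetperTrade" holds the per-trade cumulative returns.
df = pd.DataFrame({
    "Portfolio":      [0, 1, 1, 1, 0, 1, 1, 0],
    "CumRetperTrade": [0.0, 1.0, 0.5, 2.0, 0.0, 0.3, 0.2, 0.0],
})

# Label each consecutive run of identical "Portfolio" values: the id
# increments whenever Portfolio changes, so every Portfolio == 1
# stretch gets its own group.
run_id = (df["Portfolio"] != df["Portfolio"].shift()).cumsum()

# cummax() within each run, then zero out rows outside Portfolio == 1.
df["PeakCumRet"] = (
    df.groupby(run_id)["CumRetperTrade"].cummax()
      .where(df["Portfolio"] == 1, 0.0)
)
print(df["PeakCumRet"].tolist())  # [0.0, 1.0, 1.0, 2.0, 0.0, 0.3, 0.3, 0.0]
```

The `shift`/`cumsum` trick keeps everything vectorized: no Python-level loop over the runs is needed, and each run's cumulative maximum is computed independently.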

Looping through a Dask array made of npy memmap files increases RAM usage without ever freeing it

Context: I am trying to load multiple .npy files containing 2D arrays into one big 2D array, to process it chunk by chunk later. All of this data is bigger than my RAM, so I am using the memmap storage/loading system here:

pattern = os.path.join(FROM_DIR, '*.npy')
paths = sorted(glob.glob(pattern))
arrays = [np.load(path, mmap_mode='r') for path in paths]

…
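Leaving Dask aside, the memmap pattern itself can be exercised with plain NumPy. A sketch (file names and shapes are hypothetical) showing the chunk-by-chunk approach: each memmap is reduced on its own rather than concatenated, since `np.concatenate` would materialize the whole stack in RAM and defeat the memmaps.

```python
import os
import tempfile
import numpy as np

# Hypothetical stand-in for the real .npy files on disk.
tmpdir = tempfile.mkdtemp()
paths = []
for i in range(3):
    path = os.path.join(tmpdir, f"part{i}.npy")
    np.save(path, np.full((4, 5), float(i)))
    paths.append(path)

# mmap_mode='r' keeps the data on disk; pages are mapped in lazily
# as they are touched, instead of being read into RAM up front.
arrays = [np.load(p, mmap_mode="r") for p in sorted(paths)]

# Process chunk by chunk: each file is read, reduced to a scalar,
# and its pages can then be dropped by the OS.
totals = [float(a.sum()) for a in arrays]
print(totals)  # [0.0, 20.0, 40.0]
```

Note that pages touched by a read-only memmap are cached by the OS page cache, so resident memory appearing to grow during iteration is not necessarily a leak: the cache is reclaimable. Holding references to intermediate arrays built *from* the memmaps, however, does pin real RAM.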