{"text":"add please","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Examples, failed to load qml","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"DMCHMM","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Moving from MySQL to Hybrid SQL","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Doc build broken\n\nI can reproduce the circleCI failures locally, the errors are:\r\n\r\n```\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:57: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:60: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:62: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:63: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:64: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:65: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:66: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:67: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:72: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:73: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:74: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:75: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:79: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:83: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:84: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:85: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:89: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:93: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:94: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:95: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:96: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:97: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:98: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:99: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:100: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:101: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:102: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:103: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:104: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:105: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_axes:106: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:75: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:78: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring 
of matplotlib.figure.Figure.add_subplot:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:81: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:82: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:83: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:84: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:85: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:90: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:91: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:92: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:93: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:97: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:98: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:101: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:102: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:103: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:107: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:111: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of 
matplotlib.figure.Figure.add_subplot:112: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:113: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:114: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:115: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:116: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:117: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:118: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:119: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:120: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:121: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:122: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:123: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.add_subplot:124: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:15: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:18: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:20: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:21: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:22: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:23: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:24: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:25: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:30: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:31: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:32: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:33: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:37: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:38: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:41: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:42: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:43: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:47: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:51: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:52: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:53: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:54: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:55: WARNING: py:meth 
reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:56: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:57: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:58: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:59: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:60: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:61: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:62: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:63: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/figure.py:docstring of matplotlib.figure.Figure.gca:64: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:70: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:73: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:75: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:76: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:77: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:78: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:79: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:85: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:86: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:87: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:88: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:92: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:93: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:96: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:97: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:98: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:102: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:106: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:107: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:108: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:109: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:110: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:111: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:112: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:113: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:114: WARNING: py:meth reference target not found: 
matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:115: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:116: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:117: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:118: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.axes:119: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:71: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:74: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:76: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:77: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:78: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:79: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:81: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:86: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:87: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:88: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:89: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:93: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:94: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:97: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:98: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:99: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:103: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:107: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:108: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:109: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:110: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:111: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:112: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:113: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:114: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:115: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:116: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:117: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:118: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:119: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/pyplot.py:docstring of matplotlib.pyplot.subplot:120: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:83: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:85: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:86: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:87: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:88: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:89: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:90: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:95: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:96: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:97: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:98: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:102: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:103: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:106: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:107: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:108: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:112: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:116: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:117: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:118: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:119: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:120: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:121: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:122: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:123: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:124: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:125: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:126: WARNING: py:meth reference 
target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:127: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:128: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.inset_axes:129: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:67: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:70: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:72: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:73: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:74: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:75: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:76: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:77: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:82: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:83: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:84: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:85: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:89: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:90: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:93: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:94: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:95: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:99: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:103: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:104: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:105: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:106: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:107: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:108: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:109: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:110: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:111: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:112: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:113: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:114: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:115: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/inset_locator.py:docstring of mpl_toolkits.axes_grid1.inset_locator.zoomed_inset_axes:116: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:31: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:34: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:36: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:37: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:38: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:39: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:40: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:41: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:46: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:47: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:48: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:49: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:53: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:54: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:57: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:58: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:59: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:63: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:67: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:68: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:69: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:70: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:71: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:72: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:73: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:74: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:75: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:76: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:77: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:78: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:79: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/mpl_toolkits/axes_grid1/mpl_axes.py:docstring of mpl_toolkits.axes_grid1.mpl_axes.Axes:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:63: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_adjustable\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:66: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_anchor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:68: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_aspect\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:69: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscale_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:70: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscalex_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:71: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_autoscaley_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:72: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axes_locator\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:73: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_axisbelow\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:78: WARNING: py:meth reference target not found: 
matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:79: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_facecolor\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:80: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_figure\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:81: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_frame_on\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:85: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:86: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_navigate_mode\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:89: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_position\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:90: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_prop_cycle\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:91: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_rasterization_zorder\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:95: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_title\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:99: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xbound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:100: WARNING: py:meth reference target not found: matplotlib.axes._axes.Axes.set_xlabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:101: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xlim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:102: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xmargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:103: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:104: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:105: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_xticks\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:106: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ybound\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:107: WARNING: py:meth reference target not found: 
matplotlib.axes._axes.Axes.set_ylabel\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:108: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ylim\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:109: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_ymargin\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:110: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yscale\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:111: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticklabels\r\n/Users/dstansby/github/matplotlib/lib/matplotlib/axes/_axes.py:docstring of matplotlib.axes.Axes:112: WARNING: py:meth reference target not found: matplotlib.axes._base._AxesBase.set_yticks\r\n/Users/dstansby/github/matplotlib/doc/api/prev_api_changes/api_changes_1.3.x.rst:97: WARNING: py:attr reference target not found: matplotlib.colorbar.ColorbarBase.ax\r\n/Users/dstansby/github/matplotlib/doc/gallery/lines_bars_and_markers/gradient_bar.rst:39: WARNING: Could not lex literal_block as \"python\". Highlighting skipped.\r\n/Users/dstansby/github/matplotlib/doc/tutorials/colors/colormap-manipulation.rst:93: WARNING: py:obj reference target not found: ListedColormap\r\n/Users/dstansby/github/matplotlib/doc/tutorials/colors/colormap-manipulation.rst:482: WARNING: py:obj reference target not found: LinearSegmentedColormap.from_list\r\n/Users/dstansby/github/matplotlib/doc/tutorials/introductory/usage.rst:602: WARNING: py:obj reference target not found: name.of.the.backend\r\n/Users/dstansby/github/matplotlib/doc/tutorials/introductory/usage.rst:602: WARNING: py:obj reference target not found: module://name.of.the.backend\r\n/Users/dstansby/github/matplotlib/doc/tutorials/introductory/usage.rst:602: WARNING: py:obj reference target not found: matplotlib.use('module://name.of.the.backend')\r\ngenerating indices... genindex py-modindex\r\nhighlighting module code... [100%] numpy \r\nwriting additional pages... search opensearch\r\ncopying images... [100%] users/../build/plot_directive/users/whats_new-4.png \r\ncopying downloadable files... [100%] tutorials/toolkits/mplot3d.ipynb \r\ncopying static files... done\r\ncopying extra files... done\r\ndumping search index in English (code: en) ... done\r\ndumping object inventory... done\r\nbuild finished with problems, 295 warnings.\r\nembedding documentation hyperlinks...\r\nembedding documentation hyperlinks for gallery... [100%] invert_axes.html \r\nembedding documentation hyperlinks for tutorials... 
[100%] usage.html \r\nlib/matplotlib/colorbar.py:docstring of matplotlib.colorbar.ColorbarBase:38: WARNING: Reference py:obj Axes for lib/matplotlib/colorbar.py:docstring of matplotlib.colorbar.ColorbarBase:38 can be removed from missing-references.json.It is no longer a missing reference in the docs.\r\nlib/matplotlib/colorbar.py:docstring of matplotlib.colorbar.ColorbarBase:41: WARNING: Reference py:obj LineCollection for lib/matplotlib/colorbar.py:docstring of matplotlib.colorbar.ColorbarBase:41 can be removed from missing-references.json.It is no longer a missing reference in the docs.\r\ndoc/faq/virtualenv_faq.rst:54: WARNING: Reference py:obj https://virtualenv.pypa.io/ for doc/faq/virtualenv_faq.rst:54 can be removed from missing-references.json.It is no longer a missing reference in the docs.\r\nmake: *** [html] Error 1\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Chevron/beesting escaping issues","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"migration guide / breaking changes | from 1.0.0-alpha.4 to 1.0.0-rc1","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"can you predict the output for the below example ? as per the documentation it should give two winners. However, giving me only one winner and that too precedence of the player in the array passed. ","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"instructions for using the debugger","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"[Prerelease] v0.1.0 ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Allow \"$ref\" to take a schema, to make inlining easier?\n\n***[EDIT: See better problem statement two comments below]***\r\n\r\nNow that `$ref` can have adjacent keywords, dereferencing/inlining is more complex. (for the purpose of this issue, we're just talking about situations where inlining is possible, e.g. no cyclic references- inlining such use cases is common, with numerous libraries dedicated to this operation).\r\n\r\nIn informal discussions, we've recommended replacing `$ref` with an `allOf` containing just the referenced schema, *OR* if there is already an adjacent `allOf`, appending the referenced schema to that `allOf`. This is rather cumbersome.\r\n\r\nAt the same time, we use runtime JSON Pointer-ish constructs that look like `/properties/foo/$ref/properties/bar/$ref`, etc., to record the runtime path as we traverse references.\r\n\r\nWhat if we allowed replacing the `$ref` URI with the target schema? e.g. if `{\"$ref\": \"A\"}` points to `{A}`, then it can be replaced with `{\"$ref\": {A}}`\r\n\r\n`$ref` here is effectively a no-op, it just allows inlining the target without having to re-arrange the context.\r\n\r\nPros:\r\n* It's much easier to explain\r\n* It matches how we report runtime paths even when not dereferenced\r\n* We're already changing `$ref` inlining, so now is the time to sort this out\r\n\r\nCons:\r\n* It's a change\r\n* Strongly typed languages may be unhappy about the string-or-object behavior\r\n* ???","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Drop shadows only take effect if other filters are present","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Protocol extension methods not available from Objective-C","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Animation Rotation Direction not correct in certain degrees","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Update readme with more usage details\n\nLets update the README and add details around build/packaging usage, as well as any details around any configuration or setup that is related or can be changed","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Examples\n\nare there any examples on this? the examples link on gatsby doesn't work. It seems like this is a very complicated thing to add and I cannot find any documentation I can follow. :/","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Issues with deploying code in Chapter 7\n\n**Issue 1:**\r\n\r\nStarted DynamoDB Local, ran the POST curl command to create the seed URL, I got success:\r\n\r\n```\r\ncurl -X POST http://localhost:4000/frontier-url/dummy-seed | jq '.'\r\n % Total % Received % Xferd Average Speed Time Time Time Current\r\n Dload Upload Total Spent Left Speed\r\n100 95 100 95 0 0 416 0 --:--:-- --:--:-- --:--:-- 416\r\n{\r\n \"depth\": 0,\r\n \"seed\": \"dummy-seed\",\r\n \"url\": \"dummy-seed\",\r\n \"status\": \"PENDING\",\r\n \"createdAt\": 1565494586240\r\n}\r\n```\r\n\r\nBut then when I ran the GET curl command to list the URLs for the dummy seed, I got a failure:\r\n\r\n```\r\ncurl http://localhost:4000/frontier-url/dummy-seed\r\nEvent object failed validation\r\n```\r\n\r\nThe process running serverless local shows an error with a long stacktrace with this at the bottom:\r\n\r\n```\r\nmessage: 'Event object failed validation',\r\n details: \r\n [ { keyword: 'required',\r\n dataPath: '.queryStringParameters',\r\n schemaPath: '#/properties/queryStringParameters/required',\r\n params: [Object],\r\n message: 'should have required property status' } ]\r\n```\r\n\r\nIt talks about a \"status\" parameter but I don't know what it's talking about based on what I read in the book so far. I examined the code and found its validation schema included a `status` property with an enum type that looks like it expects the following values:\r\n\r\n```\r\nconst STATUS_VALUES = ['PENDING', 'FETCHED', 'FAILED']\r\n```\r\n\r\nI tried the following GET curl command which worked:\r\n\r\n```\r\ncurl http://localhost:4000/frontier-url/dummy-seed?status=PENDING\r\n[]\r\n```\r\n\r\nLooks like the book may need to be changed to reflect this parameter that must be provided.\r\n\r\n**Issue 2:**\r\n\r\nWhen running the command `sls invoke local -f fetch --path test-events/load-request.json` to fun the fetcher locally, it loads the fourTheorem site in my Chrome, but then I get this error in a stack trace:\r\n\r\n```\r\nmessage: 'The specified bucket is not valid.',\r\n code: 'InvalidBucketName',\r\n region: null,\r\n time: 2019-08-11T04:02:32.932Z,\r\n requestId: 'D2C6C9E014C3F17F',\r\n extendedRequestId: 'tSb494uf3oQbz/vYZXVW2iCEVa03JoI/ATWPfBf5o5wnupg2sWe+OgiYLDSrtB1NTl8VB8iOcqg=',\r\n cfId: undefined,\r\n statusCode: 400,\r\n retryable: false,\r\n retryDelay: 94.45364589611742 \r\n```\r\n\r\nI added the following code to `handler.js` in the `storeItems` function to debug:\r\n\r\n```js\r\n...\r\nlog.debug({ keyPrefix }, 'Storing assets')\r\n\r\n console.info('***item store bucket***', itemStoreBucket);\r\n\r\n return Promise.all([\r\n...\r\n```\r\n\r\nand this was that output:\r\n\r\n```\r\n***item store bucket*** #{AWS::AccountId}-dev-item-store\r\n```\r\n\r\nSo it looks like the bucket name is not being generated properly when it runs.\r\n\r\nTracing it backwards, I see this code:\r\n\r\n```js\r\nconst itemStoreBucket = process.env.ITEM_STORE_BUCKET\r\n```\r\n\r\nThis tells me it's looking for an env var for the bucket name. 
And I see this in `serverless.yml`:\r\n\r\n```yaml\r\nenvironment:\r\n STAGE: ${self:provider.stage}\r\n ITEM_STORE_BUCKET: ${self:custom.itemStoreBucket}\r\n```\r\n\r\n```yaml\r\ncustom:\r\n itemStoreBucket: '#{AWS::AccountId}-${self:provider.stage}-item-store'\r\n```\r\n\r\nI found a GitHub issue (https://github.com/serverless/serverless/issues/3967) that makes it sound like this is a dead end because it's not supported without adding a library to support it (\"psuedo\" variable?).\r\n\r\nI googled Serverless Framework variables and found this page in the docs (https://serverless.com/framework/docs/providers/aws/guide/variables/) and at the very bottom they include something on \"Psuedo Parameters Reference\" which includess an example showing `AWS::AccountId`. I'm not sure what to make of it right now though. This is more Serverless Framework knowledge than I have right now.\r\n\r\nBut something that crosses my mind right now. I have a bucket called `<my_aws_account_id>-dev-item-store` as of this step and it looks like the goal of using the account ID in the bucket name is to make it globally unique. Why not instruct the reader to choose a name themselves as an env var in a `.env` file like the previous chapters did? That approach seems simple and worked so far as I read the book and followed along deploying.\r\n\r\nI decided to take a stab at switching to this approach myself. I hard coded a bucket prefix, changing the code in `serverless.yml` to:\r\n\r\n```yaml\r\ncustom:\r\n itemStoreBucket: 'mattwelke-sandbox-ai-as-a-service-chap7-${self:provider.stage}-item-store'\r\n```\r\n\r\nBefore trying the local invoke again, I went back to the `item-store` directory and changed the bucket name there too, since it would be a dependency. I ran `sls deploy` there to create the new bucket.\r\n\r\nWhen I go back to the `fetch-service` and run the local invoke it works now (though it hangs after fetching, needs ctrl-c and the book doesn't mention this). Side note, at this point I realize that oddly enough the `AWS::AccountId` worked fine for the bucket creating step. Not sure why. It looks like you load the psuedo parameter plugin in both `serverless.yml` files.\r\n\r\nAt the end of the chapter I was able to deploy and run the step function, and the book tells me to check the item store. I can tell that the functions ran a few times to get the screenshots so I should see more than one, and the book shows an example of \"ai-and-machine-learning\" as a page crawled, but I only have one crawled page at this point. It could be related to the bucket issue.\r\n\r\nMy full bucket path is: `\r\nAmazon S3/mattwelke-sandbox-ai-as-a-service-chap7-dev-item-store/https%3A%2F%2Ffourtheorem.com/https%3A%2F%2Ffourtheorem.com` (`/` is the arrow icon at the top) and there are just two files:\r\n\r\n- page.html\r\n- screenshot.png","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Default user role","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"SaveAs after server finishes starting writes wrong path in new location and can delete measures from old location","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Inspect the Radial Point Geometry -plugin and figure out what it does or should do","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Clarify the license situation (closed source?)","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Rename \"Isoccluded\" to \"Partially covered\"","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Search string not parsing properly","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[Documentation] How is the AMP transformation stuff supposed to work ?","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"fs.readFile with 'utf8' option get an object when read an empty file in the ASAR ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Help needed!","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# warn The following packages use deprecated \"rnpm\" config that will stop working from next release:\n\n You can find more details at https://github.com/react-native-community/cli/blob/master/docs/configuration.md#migration-guide.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add documentation explaining how to enable dev and test modes","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# failed using tensorrt on tx2\n\nafter installation on tx2, I found the default fps is quite low, about 5 fps, much lower than the one shown on readme. \r\nI just use $ python3 run_webcam.py --model=mobilenet_thin --resize=432x368 --camera=1 .\r\n\r\nwhile I use $ python3 run_webcam.py --model=mobilenet_thin --resize=432x368 --camera=1 --tensorrt=True, it reports error as below:\r\n\r\nwill@will-desktop:~/tf-pose-estimation$ python3 run_webcam.py --model=mobilenet_thin --resize=432x368 --camera=1 --tensorrt=True\r\n2019-08-11 18:03:18.585491: I tensorflow/stream_executor/platform/default/dso_loader.cc:42] Successfully opened dynamic library libcudart.so.10.0\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:516: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:517: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:518: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:519: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:520: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorflow/python/framework/dtypes.py:525: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:541: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint8 = np.dtype([(\"qint8\", np.int8, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:542: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_quint8 = np.dtype([(\"quint8\", np.uint8, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:543: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint16 = np.dtype([(\"qint16\", np.int16, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:544: FutureWarning: Passing (type, 1) or '1type' as 
a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_quint16 = np.dtype([(\"quint16\", np.uint16, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:545: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n _np_qint32 = np.dtype([(\"qint32\", np.int32, 1)])\r\n/usr/local/lib/python3.6/dist-packages/tensorboard/compat/tensorflow_stub/dtypes.py:550: FutureWarning: Passing (type, 1) or '1type' as a synonym of type is deprecated; in a future version of numpy, it will be understood as (type, (1,)) / '(1,)type'.\r\n np_resource = np.dtype([(\"resource\", np.ubyte, 1)])\r\n2019-08-11 18:03:30.661049: I tensorflow/stream_executor/platform/default/dso_loader.cc:42] Successfully opened dynamic library libcudart.so.10.0\r\nWARNING: Logging before flag parsing goes to stderr.\r\nW0811 18:03:30.744550 547506102288 deprecation_wrapper.py:119] From /home/will/tf-pose-estimation/tf_pose/mobilenet/mobilenet.py:369: The name tf.nn.avg_pool is deprecated. Please use tf.nn.avg_pool2d instead.\r\n\r\n[2019-08-11 18:03:35,496] [TfPoseEstimator-WebCam] [DEBUG] initialization mobilenet_thin : /home/will/tf-pose-estimation/models/graph/mobilenet_thin/graph_opt.pb\r\nI0811 18:03:35.496699 547506102288 run_webcam.py:42] initialization mobilenet_thin : /home/will/tf-pose-estimation/models/graph/mobilenet_thin/graph_opt.pb\r\n[2019-08-11 18:03:35,498] [TfPoseEstimator] [INFO] loading graph from /home/will/tf-pose-estimation/models/graph/mobilenet_thin/graph_opt.pb(default size=432x368)\r\nI0811 18:03:35.498320 547506102288 estimator.py:310] loading graph from /home/will/tf-pose-estimation/models/graph/mobilenet_thin/graph_opt.pb(default size=432x368)\r\nW0811 18:03:35.499779 547506102288 deprecation_wrapper.py:119] From /home/will/tf-pose-estimation/tf_pose/estimator.py:311: The name tf.gfile.GFile is deprecated. Please use tf.io.gfile.GFile instead.\r\n\r\nW0811 18:03:35.500979 547506102288 deprecation_wrapper.py:119] From /home/will/tf-pose-estimation/tf_pose/estimator.py:312: The name tf.GraphDef is deprecated. Please use tf.compat.v1.GraphDef instead.\r\n\r\nTraceback (most recent call last):\r\n File \"run_webcam.py\", line 45, in <module>\r\n e = TfPoseEstimator(get_graph_path(args.model), target_size=(w, h), trt_bool=str2bool(args.tensorrt))\r\n File \"/home/will/tf-pose-estimation/tf_pose/estimator.py\", line 327, in __init__\r\n use_calibration=True,\r\nTypeError: create_inference_graph() got an unexpected keyword argument 'use_calibration'\r\n\r\ncould someone tell me why it reports error? thanks.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Example and documentation for Angular 2/4","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Update Readme with better use directions\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Problems when trying to connect and retrieve data from Box programmatically\n\n- [x] I have checked that the [SDK documentation][sdk-docs] and [API documentation][api-docs] doesn't solve my issue\r\n\r\n### Description of the Issue\r\nCould someone provide a recommendation for a connector in Java which connects to Box and retrieves files, comments, and collaborations? I am able to do that for a particular App, using a Developer Token. However, I want to be able to not rely on Developer Tokens since they are short-lived.\r\n\r\nI have looked at:\r\n1. Connecting with OAuth. I do not see a full code sample for it. This seems more suited for interactive logins involving GUI. I want to just connect programmatically and retrieve data from Box. Hitting the first URL returns the HTML login page. It has a hidden redirect_url... I suppose I could extract that and see if I can make that work.. Seems too complicated; there must be an easier way?\r\n\r\nThere is a sample involving some Spark code\r\n`get(\"/return\", (req, res) -> {\r\n // Capture auth code \r\n String code = req.queryParams(\"code\"); \r\n // Instantiate new Box API connection object\r\n BoxAPIConnection client = new BoxAPIConnection(Config.client_id, Config.client_secret, code); \r\n // PERFORM API ACTIONS WITH CLIENT\r\n });`\r\nWhere would the code come from? I'm surmising it'd be returned from the redirect..\r\n\r\n2. Use JWT\r\n- Got the public and private key set up, downloaded the JSON generated by Box.\r\n- The below code returned an error: \"The API returned an error code [400] invalid_grant - Please check the 'sub' claim. The 'sub' specified is invalid.\"\r\n- Tried running with user ID from my app and using the Enterprise ID (which is 0). Still get the 400 error.\r\n\r\n` JWTEncryptionPreferences jwtPreferences = new JWTEncryptionPreferences();\r\n jwtPreferences.setPublicKeyID(PUBLIC_KEY_ID);\r\n jwtPreferences.setPrivateKeyPassword(PRIVATE_KEY_PASSWORD);\r\n jwtPreferences.setPrivateKey(PRIVATE_KEY);\r\n jwtPreferences.setEncryptionAlgorithm(EncryptionAlgorithm.RSA_SHA_256);\r\n\r\n BoxDeveloperEditionAPIConnection api = BoxDeveloperEditionAPIConnection.getAppUserConnection(\r\n USER_ID, CLIENT_ID,\r\n CLIENT_SECRET, jwtPreferences);\r\n\r\n BoxUser.Info userInfo = BoxUser.getCurrentUser(api).getInfo();`\r\n\r\n3. Using the App Token View\r\n- Have a primary access token and secondary.\r\n- Get the error \"The API returned an error code [403 | ... access_denied_insufficient_permissions - Access denied - insufficient permission\"\r\n\r\n`private static final String PRIMARY_ACCESS_TOKEN = \"...\"\r\n private static final String USER_ID = \"183070079\";\r\n BoxTransactionalAPIConnection api = new BoxTransactionalAPIConnection(PRIMARY_ACCESS_TOKEN);\r\n api.asUser(USER_ID);\r\n BoxFolder rootFolder = BoxFolder.getRootFolder(api);\r\n`\r\nIf I run with no user ID provided I get no content back.. presumably there's nothing in the uppermost root, has to be a specific user(s).\r\n\r\nAny hints or recommendations would be greatly appreciated.\r\nThanks.\r\n\r\n\r\n### Versions Used\r\nJava SDK: Box 2.36.0, java 1.8\r\nJava: 1.8\r\n\r\n### Steps to Reproduce\r\n<!-- Please include detailed steps to reproduce the issue you're seeing, if possible. -->\r\n<!-- If you don't have a reproducible error, please make sure that you give us as much detail -->\r\n<!-- as you can about what your application was doing when the error occurred. 
-->\r\n<!-- Good steps to reproduce the problem help speed up debugging for us and gets your issue resolved sooner! -->\r\n\r\n### Error Message, Including Stack Trace\r\n<!-- Replace with the full error output you're seeing, if applicable. -->\r\n<!-- Please include the full stack trace to help us identify where the error is happening. -->\r\n\r\n[sdk-docs]: ./doc\r\n[api-docs]: https://developer.box.com/docs\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# How to Delete Connector from openCTI?\n\nIs there a way to delete connector from openCTI at this time? The documentation only states to add connectors but not to remove them.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Welcome\n\n## Step 1: Enable GitHub Pages\n\nWelcome to GitHub Pages and Jekyll :tada:!\n\nIf you're new to GitHub Pages, or you want to learn how to build and host a [GitHub Pages](https://pages.github.com) site, you're in the right place. With GitHub Pages, you can host content like [documentation](https://flight-manual.atom.io/), [resumes](https://github.com/jglovier/resume-template), or any other static content that you\u2019d like.\n\nIn this course, you'll create a blog hosted on GitHub Pages and learn how to:\n\n- Enable GitHub Pages\n- Use [Jekyll](https://jekyllrb.com/), a static site generator\n- Customize Jekyll sites with a theme and content\n\n### New to GitHub?\n\nFor this course, you'll need to know how to create a branch on GitHub, commit changes using Git, and open a pull request on GitHub. If you need a refresher on the GitHub flow, check out the [the Introduction to GitHub course](https://lab.github.com/courses/introduction-to-github).\n\n### :keyboard: Activity: Generate a GitHub Pages site\n\nThe first step to publishing your blog to the web is to enable GitHub Pages on this repository <sup>[:book:](https://help.github.com/articles/github-glossary/#repository)</sup>. When you enable GitHub Pages on a repository, GitHub takes the content that's on the master branch and publishes a website based on its contents.\n\n1. Under your repository name, click [**Settings**](https://github.com/younglighting/github-pages-with-jekyll/settings).\n1. In the \"GitHub Pages\" section, in the Source drop-down, select **master branch**.\n\nAfter GitHub Pages is enabled and the site is started, we'll be ready to create some more content. \n\n> Turning on GitHub Pages creates a deployment of your repository. I may take up to a minute to respond as I await the deployment.\n\n<hr>\n<h3 align=\"center\">Return to this issue for my next comment.</h3>\n\n> _Sometimes I respond too fast for the page to update! If you perform an expected action and don't see a response from me, wait a few seconds and refresh the page for your next steps._\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Using a type needed by the injectable class in the injectable class leads to \"Can't resolve all parameters for\"","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Sync successful but did not pull emails to google group","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Document Step-by-Step instructions to the Installer Generation process","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Failing in starting a local cluster from jupyter notebook","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# make test: cannot find package\n\nWhen I run \"make test\" in the kritis directory, I reported the following error. According to the official documentation, I need to set the GOPATH environment variable, but I have set it in the ~/.bash_profile file according to the official documentation: export GOPATH=$HOME/go.\r\n\r\nGOOS=linux GOARCH=amd64 CGO_ENABLED=0 go build -ldflags \"\" -tags \"\" -o out/resolve-tags-linux-amd64 github.com/grafeas/kritis/cmd/kritis/kubectl/plugins/resolve\r\nvendor/github.com/golang/glog/glog.go:74:2: cannot find package \"bufio\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/bufio (vendor tree)\r\n\t/usr/lib/golang/src/bufio (from $GOROOT)\r\n\t/root/go/src/bufio (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:75:2: cannot find package \"bytes\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/bytes (vendor tree)\r\n\t/usr/lib/golang/src/bytes (from $GOROOT)\r\n\t/root/go/src/bytes (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/v1util/zip.go:19:2: cannot find package \"compress/gzip\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/compress/gzip (vendor tree)\r\n\t/usr/lib/golang/src/compress/gzip (from $GOROOT)\r\n\t/root/go/src/compress/gzip (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/hash.go:18:2: cannot find package \"crypto/sha256\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/crypto/sha256 (vendor tree)\r\n\t/usr/lib/golang/src/crypto/sha256 (from $GOROOT)\r\n\t/root/go/src/crypto/sha256 (from $GOPATH)\r\nvendor/gopkg.in/yaml.v2/decode.go:4:2: cannot find package \"encoding\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/encoding (vendor tree)\r\n\t/usr/lib/golang/src/encoding (from $GOROOT)\r\n\t/root/go/src/encoding (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/authn/basic.go:18:2: cannot find package \"encoding/base64\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/encoding/base64 (vendor tree)\r\n\t/usr/lib/golang/src/encoding/base64 (from $GOROOT)\r\n\t/root/go/src/encoding/base64 (from $GOPATH)\r\nvendor/github.com/spf13/pflag/string_slice.go:5:2: cannot find package \"encoding/csv\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/encoding/csv (vendor tree)\r\n\t/usr/lib/golang/src/encoding/csv (from $GOROOT)\r\n\t/root/go/src/encoding/csv (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/hash.go:19:2: cannot find package \"encoding/hex\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/encoding/hex (vendor tree)\r\n\t/usr/lib/golang/src/encoding/hex (from $GOROOT)\r\n\t/root/go/src/encoding/hex (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/remote/error.go:18:2: cannot find package \"encoding/json\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/encoding/json (vendor tree)\r\n\t/usr/lib/golang/src/encoding/json (from $GOROOT)\r\n\t/root/go/src/encoding/json (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:76:2: cannot find package \"errors\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/errors (vendor tree)\r\n\t/usr/lib/golang/src/errors (from $GOROOT)\r\n\t/root/go/src/errors (from $GOPATH)\r\ncmd/kritis/kubectl/plugins/resolve/cmd/root.go:20:2: cannot find package \"flag\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/flag (vendor tree)\r\n\t/usr/lib/golang/src/flag (from $GOROOT)\r\n\t/root/go/src/flag (from 
$GOPATH)\r\ncmd/kritis/kubectl/plugins/resolve/cmd/root.go:21:2: cannot find package \"fmt\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/fmt (vendor tree)\r\n\t/usr/lib/golang/src/fmt (from $GOROOT)\r\n\t/root/go/src/fmt (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/hash.go:22:2: cannot find package \"hash\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/hash (vendor tree)\r\n\t/usr/lib/golang/src/hash (from $GOROOT)\r\n\t/root/go/src/hash (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:79:2: cannot find package \"io\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/io (vendor tree)\r\n\t/usr/lib/golang/src/io (from $GOROOT)\r\n\t/root/go/src/io (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/authn/keychain.go:21:2: cannot find package \"io/ioutil\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/io/ioutil (vendor tree)\r\n\t/usr/lib/golang/src/io/ioutil (from $GOROOT)\r\n\t/root/go/src/io/ioutil (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:80:2: cannot find package \"log\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/log (vendor tree)\r\n\t/usr/lib/golang/src/log (from $GOROOT)\r\n\t/root/go/src/log (from $GOPATH)\r\nvendor/gopkg.in/yaml.v2/decode.go:8:2: cannot find package \"math\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/math (vendor tree)\r\n\t/usr/lib/golang/src/math (from $GOROOT)\r\n\t/root/go/src/math (from $GOPATH)\r\nvendor/github.com/spf13/pflag/ip.go:5:2: cannot find package \"net\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/net (vendor tree)\r\n\t/usr/lib/golang/src/net (from $GOROOT)\r\n\t/root/go/src/net (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/remote/transport/basic.go:18:2: cannot find package \"net/http\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/net/http (vendor tree)\r\n\t/usr/lib/golang/src/net/http (from $GOROOT)\r\n\t/root/go/src/net/http (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/name/registry.go:17:8: cannot find package \"net/url\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/net/url (vendor tree)\r\n\t/usr/lib/golang/src/net/url (from $GOROOT)\r\n\t/root/go/src/net/url (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:81:2: cannot find package \"os\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/os (vendor tree)\r\n\t/usr/lib/golang/src/os (from $GOROOT)\r\n\t/root/go/src/os (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/authn/helper.go:21:2: cannot find package \"os/exec\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/os/exec (vendor tree)\r\n\t/usr/lib/golang/src/os/exec (from $GOROOT)\r\n\t/root/go/src/os/exec (from $GOPATH)\r\nvendor/github.com/golang/glog/glog_file.go:26:2: cannot find package \"os/user\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/os/user (vendor tree)\r\n\t/usr/lib/golang/src/os/user (from $GOROOT)\r\n\t/root/go/src/os/user (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/authn/keychain.go:24:2: cannot find package \"path\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/path (vendor tree)\r\n\t/usr/lib/golang/src/path (from $GOROOT)\r\n\t/root/go/src/path (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:82:2: cannot find package \"path/filepath\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/path/filepath (vendor 
tree)\r\n\t/usr/lib/golang/src/path/filepath (from $GOROOT)\r\n\t/root/go/src/path/filepath (from $GOPATH)\r\nvendor/gopkg.in/yaml.v2/decode.go:9:2: cannot find package \"reflect\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/reflect (vendor tree)\r\n\t/usr/lib/golang/src/reflect (from $GOROOT)\r\n\t/root/go/src/reflect (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/v1/remote/transport/scheme.go:18:2: cannot find package \"regexp\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/regexp (vendor tree)\r\n\t/usr/lib/golang/src/regexp (from $GOROOT)\r\n\t/root/go/src/regexp (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:83:2: cannot find package \"runtime\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/runtime (vendor tree)\r\n\t/usr/lib/golang/src/runtime (from $GOROOT)\r\n\t/root/go/src/runtime (from $GOPATH)\r\nvendor/gopkg.in/yaml.v2/encode.go:9:2: cannot find package \"sort\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/sort (vendor tree)\r\n\t/usr/lib/golang/src/sort (from $GOROOT)\r\n\t/root/go/src/sort (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:84:2: cannot find package \"strconv\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/strconv (vendor tree)\r\n\t/usr/lib/golang/src/strconv (from $GOROOT)\r\n\t/root/go/src/strconv (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:85:2: cannot find package \"strings\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/strings (vendor tree)\r\n\t/usr/lib/golang/src/strings (from $GOROOT)\r\n\t/root/go/src/strings (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:86:2: cannot find package \"sync\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/sync (vendor tree)\r\n\t/usr/lib/golang/src/sync (from $GOROOT)\r\n\t/root/go/src/sync (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:87:2: cannot find package \"sync/atomic\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/sync/atomic (vendor tree)\r\n\t/usr/lib/golang/src/sync/atomic (from $GOROOT)\r\n\t/root/go/src/sync/atomic (from $GOPATH)\r\nvendor/github.com/spf13/cobra/cobra.go:25:2: cannot find package \"text/template\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/text/template (vendor tree)\r\n\t/usr/lib/golang/src/text/template (from $GOROOT)\r\n\t/root/go/src/text/template (from $GOPATH)\r\nvendor/github.com/golang/glog/glog.go:88:2: cannot find package \"time\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/time (vendor tree)\r\n\t/usr/lib/golang/src/time (from $GOROOT)\r\n\t/root/go/src/time (from $GOPATH)\r\nvendor/gopkg.in/yaml.v2/sorter.go:5:2: cannot find package \"unicode\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/unicode (vendor tree)\r\n\t/usr/lib/golang/src/unicode (from $GOROOT)\r\n\t/root/go/src/unicode (from $GOPATH)\r\nvendor/github.com/google/go-containerregistry/pkg/name/check.go:19:2: cannot find package \"unicode/utf8\" in any of:\r\n\t/root/go/src/github.com/grafeas/kritis/vendor/unicode/utf8 (vendor tree)\r\n\t/usr/lib/golang/src/unicode/utf8 (from $GOROOT)\r\n\t/root/go/src/unicode/utf8 (from $GOPATH)\r\n\r\nHas anyone encountered this kind of problem? Can you tell me how to solve it? ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"High Availability by other storage","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"mubahood","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# `Methods` section in roxygen docs malformed\n\nIn my package, I am extracting a list of available `S3` tidy methods using `generics` template:\r\n\r\nhttps://github.com/IndrajeetPatil/broomExtra/blob/56292a0fca15cd18df605c32b1f8e6d041813f25/R/generics.R#L17-L18\r\n\r\nBut, as can be seen from the webpage, this is not properly formatted:\r\nhttps://indrajeetpatil.github.io/broomExtra/reference/tidy.html#methods\r\n\r\n\r\n\r\nAny ideas on how to fix this?","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Cell Renderer Issues After 11.0.0 -> 12.0.2 Upgrade","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Ubuntu 16.04 - Error connecting client \"http: multiple response.WriteHeader calls\"\n\nWhen running through the examples on a local machine I'm getting an error ( \"http: multiple response.WriteHeader calls\") when trying to connect the `inlets client`\r\n\r\nRunning the following services:\r\n```shell\r\n# inlets server\r\n$ inlets server --port=8091 --token=\"$token\"\r\n```\r\n```shell\r\n# Local hash-browns service\r\n$ go get github.com/alexellis/hash-browns\r\n$ port=3001 hash-browns\r\n\r\n# inlets client\r\n inlets client \\\r\n --remote=127.0.0.1:8091 \\\r\n --upstream=http://127.0.0.1:3001\r\n --token $TOKEN\r\n```\r\n\r\n\r\n## Expected Behaviour\r\nThe inlets client is able to connect with the server.\r\n\r\n## Current Behaviour\r\nThe client and server return errors when connect.\r\n\r\nClient errors:\r\n```shell\r\n$ inlets client \\\r\n --remote=127.0.0.1:8091 \\\r\n --upstream=http://127.0.0.1:3001\r\n --token $TOKEN\r\n2019/08/10 20:26:23 Upstream: => http://127.0.0.1:3001\r\nmap[X-Inlets-Id:[xxxx] X-Inlets-Upstream:[=http://127.0.0.1:3001]]\r\nINFO[0000] Connecting to proxy url=\"ws://127.0.0.1:8091/tunnel\"\r\nERRO[0000] Failed to connect to proxy error=\"websocket: bad handshake\"\r\nERRO[0000] Failed to connect to proxy error=\"websocket: bad handshake\"\r\n```\r\n\r\nThe server error is:\r\n```shell\r\n$ inlets server --port=8091 --token=\"$token\"\r\n2019/08/10 20:25:42 Server token: \"xxxx\"\r\n2019/08/10 20:25:42 Listening on :8091\r\n2019/08/10 20:26:23 http: multiple response.WriteHeader calls\r\n2019/08/10 20:26:28 http: multiple response.WriteHeader calls\r\n```\r\n\r\n## Your Environment\r\n<!--- Include as many relevant details about the environment you experienced the bug in -->\r\n\r\n* inlets version `inlets --version` \r\n\r\nI'm running the following versions on client server installed from:\r\n```shell\r\n$ curl -sLS https://get.inlets.dev | sudo sh\r\n$ inlets -v\r\nVersion: 2.2.0\r\nGit Commit: 2f5e458d062e55dda9f08109f7b2c3c6919fcdf9\r\n```\r\n\r\nI've also tried installing from latest using `go get` and still get the same error:\r\n```shell\r\n$ go get -u github.com/alexellis/inlets\r\n$ inlets -v\r\nVersion: dev\r\nGit Commit:\r\n```\r\n- OS: Linux \r\n - Also tried Mac OSX as inlets client on same network, same error\r\n\r\n```\r\n$ lsb_release -a\r\nNo LSB modules are available.\r\nDistributor ID:\tUbuntu\r\nDescription:\tUbuntu 16.04.6 LTS\r\nRelease:\t16.04\r\nCodename:\txenial\r\n```\r\n\r\n```\r\n$ go version\r\ngo version go1.11 linux/amd64\r\n```\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# add missing documentation for server-side events\n\nadd missing documentation for server-side events","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# How to translate outside router, request?\n\nCurrently I have integrated successfully to routes, request and config. \r\n\r\n### Application Structure\r\n\r\n- app.js - The entry point to our application. This file defines our express server and connects it to MongoDB using mongoose. It also requires the routes and models we'll be using in the application.\r\n\r\n- config/ - This folder contains configuration for passport as well as a central location for configuration/environment variables.\r\n\r\n- routes/ - This folder contains the route definitions for our API.\r\n\r\n- models/ - This folder contains the schema definitions for our Mongoose models.\r\n\r\n**routes/api/users.js**\r\n\r\n```\r\nvar mongoose = require('mongoose');\r\nvar router = require('express').Router();\r\nvar passport = require('passport');\r\nvar User = mongoose.model('User');\r\nvar auth = require('../auth');\r\n\r\nrouter.post('/users', function(req, res, next){\r\n var user = new User();\r\n\r\n user.username = req.body.user.username;\r\n user.email = req.body.user.email;\r\n user.setPassword(req.body.user.password);\r\n\r\n user.save(req, res).then(function(){\r\n return res.json({user: user.toAuthJSON()});\r\n }).catch(next);\r\n});\r\n\r\nmodule.exports = router;\r\n\r\n```\r\n**models/User.js**\r\n\r\n```\r\nvar mongoose = require('mongoose');\r\nvar uniqueValidator = require('mongoose-unique-validator');\r\nvar crypto = require('crypto');\r\nvar jwt = require('jsonwebtoken');\r\nvar secret = require('../config').secret;\r\n\r\nvar UserSchema = new mongoose.Schema({\r\n username: {type: String, lowercase: true, unique: true, required: [true, \"can't be blank\"], match: [/^[a-zA-Z0-9]+$/, 'is invalid'], index: true},\r\n email: {type: String, lowercase: true, unique: true, required: [true, \"can't be blank\"], match: [/\\S+@\\S+\\.\\S+/, 'is invalid'], index: true},\r\n bio: String,\r\n image: String,\r\n favorites: [{ type: mongoose.Schema.Types.ObjectId, ref: 'Article' }],\r\n following: [{ type: mongoose.Schema.Types.ObjectId, ref: 'User' }],\r\n hash: String,\r\n salt: String\r\n}, {timestamps: true});\r\n\r\n /**<<HERE>> Does not work inside the models. 
<<HERE>>**/\r\nUserSchema.plugin(uniqueValidator, {message: i18n.__('user.register.errors.is_already_taken')});\r\n\r\nmongoose.model('User', UserSchema);\r\n\r\n```\r\n\r\n**app.js**\r\n\r\n```\r\nvar http = require('http'),\r\n path = require('path'),\r\n methods = require('methods'),\r\n express = require('express'),\r\n bodyParser = require('body-parser'),\r\n session = require('express-session'),\r\n cors = require('cors'),\r\n passport = require('passport'),\r\n errorhandler = require('errorhandler'),\r\n mongoose = require('mongoose');\r\n cookieParser = require('cookie-parser');\r\n i18n = require('i18n');\r\n\r\nvar isProduction = process.env.NODE_ENV === 'production';\r\n\r\n// Create global app object\r\nvar app = express();\r\n\r\napp.use(cors());\r\n\r\napp.use(cookieParser())\r\n\r\n// Configure i18n\r\ni18n.configure({\r\n locales:['en', 'es'],\r\n directory: __dirname + '/locales',\r\n objectNotation: true,\r\n //defaultLocale: 'en',\r\n register: global,\r\n cookie: 'lang'\r\n})\r\n\r\ni18n.init();\r\n\r\n// you will need to use cookieParser to expose cookies to req.cookies\r\napp.use(cookieParser());\r\n\r\n// i18n init parses req for language headers, cookies, etc.\r\napp.use(i18n.init);\r\n\r\n// Support \"Accept-Language\" : \"es\" or \"en\" via headers\r\napp.use(function (req, res, next) {\r\n i18n.init(req, res);\r\n\r\n if (typeof req.locale !== 'undefined') {\r\n console.log('locale : ', req.locale);\r\n i18n.setLocale(req.locale);\r\n }else{\r\n var locale = i18n.getLocale();\r\n res.set('Content-Language', locale);\r\n }\r\n next();\r\n});\r\n// finally, let's start our server...\r\nvar server = app.listen( process.env.PORT || 3000, function(){\r\n console.log('Listening on port ' + server.address().port);\r\n});\r\n\r\n```\r\n\r\nBut when I try translate inside a **models/** for example i18n.__('my-key'), this does not detect the locale so this returns the locale for default \"en\", but from header i receive correctly the lang \"es\".\r\n\r\nPlease let me know how I will fix it :)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update to latest Angular version","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Create a diagnostic if ONBUILD trigger instructions are not written in uppercase","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"While using autocomplete in modal form any click outside tags input doesn't close autocomplete items","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":".gitattributes linguist-vendored attribute not working unless 'true' is set","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Making dialog titles use Book Style Capitalization [needs-docs]","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# CVE-2018-11698 (High) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-11698 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nAn issue was discovered in LibSass through 3.5.4. 
An out-of-bounds read of a memory region was found in the function Sass::handle_error which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.\n\n<p>Publish Date: 2018-06-04\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11698>CVE-2018-11698</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11698\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11698</a></p>\n<p>Release Date: 2019-08-06</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# No information regarding docker setup\n\n### Missing docs on using docker-compose \r\nNo docs on instructions regarding the use of the docker-compose setup.\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"self invocating function doesn't require explicit call.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Clean up licence","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Convert documentation from google slides to HTML\n\nSlides can be found here: https://bit.ly/rx-marble-design-system_slides","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Unable to right click on the hamburger menu filter","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Fully document SPA tokens\n\n <!-- \r\n Please read contribution guidelines first: <to be added>\r\n-->\r\n\r\n## Description of documentation change\r\nFully document the SPA tokens. This blog provides most of the tokens but does not document key items like the _provider_ attribute of the `[JavaScript...]` token with the valid values for the attribute or an example of its usage.\r\n\r\nhttps://www.dnnsoftware.com/community-blog/cid/155247/module-development-in-dnn-8-5--new-tokens-to-support-building-pure-spa-modules\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# dna4_revcomp\n\nSolve these issues before release.\r\n\r\n- [ ] Finish the implementation.\r\n- [ ] Write documentation.\r\n- [ ] Write a manpage.\r\n- [ ] Create unit tests. Verify the all implementations and the dispatching.\r\n- [ ] Check exported symbols.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Add contributor\n\n@all-contributors please add @jtsom for docs","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# documentation for `tidy()` goes nowhere\n\nWhile trying to diagnose the source of https://github.com/r-lib/generics/issues/43, I noticed that `tidy()` documentation in `broom` website goes nowhere:\r\n\r\n\r\n\r\nI think this is because you are using the `CRAN` version of `pkgdown`-\r\n\r\nhttps://github.com/tidymodels/broom/blob/4a6cdcb78d8508236bc7ef1b47ae0e461b86d478/.travis.yml#L15-L22\r\n\r\nIn the development version of `pkgdown`, functions are linked to their documentation on https://rdrr.io\r\n\r\nFor example, here is a template from `rlang`-\r\n\r\nhttps://github.com/r-lib/rlang/blob/09fbc8618bb0a886d2d4546e8cbd378b1c178f65/.travis.yml#L14-L21","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Feature: S3 Bucket Notification Support in Cluster Configurations","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[REVIEW]: rtimicropem: an R package supporting the analysis of RTI MicroPEM output files","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Google Docs is having problem, strange reading voice\n\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Documentation bug or incomplete check for python shellHook development mode\n\nhttps://github.com/NixOS/nixpkgs/blob/master/doc/languages-frameworks/python.section.md#L392 states `If we create a shell.nix file which calls buildPythonPackage, and if src is a local source, and if the local source has a setup.py, then development mode is activated.`\r\nbut it is not actually checked whether `src` is a local path. ","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Transition plan for node-github","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Use automatic port allocation","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Steps in the exercise are too big","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How to establish a blood relationship to the two tables?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"What is the Hero Card Attachment limit for a message in MS Teams?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Docs for OKD deployment\n\nI've been attempting to deploy OKD to BYO Infrastructure by setting \r\n```\r\nopenshift_deployment_type: origin\r\ndeployment_type: origin\r\n```\r\nCurrently running into the error:\r\n\r\n`Error reading manifest latest in registry.redhat.io/openshift/origin: unknown: Not Found`\r\n\r\nI'm assuming there's issues with my configuration but can't seem to find documentation on how to use casl-ansible for an OKD deployment (or if it's even it scope)","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Clarity on libraries \n\n**Is your feature request related to a problem? Please describe.**\r\nI'm a bit confused at the moment I saw that you are a contributor on https://github.com/testcontainers/testcontainers-dotnet. What was the reason you created dotnet-testcontainers instead of continuing on the other one ?\r\n\r\n**Describe the solution you'd like**\r\nMaybe add a section to the readme how this relates to testcontainers-dotnet\r\n\r\n**Describe alternatives you've considered**\r\n\r\n**Additional context**\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# writeCharacteristicWithResponseForService the promise always returns \"Operation was rejected\"\n\n## Prerequisites\r\nReact Native 0.60.4\r\nReact Native Ble Plx 1.0.3\r\n- [x] I am running the latest version\r\n- [x] I checked the documentation and found no answer\r\n- [x] I checked to make sure that this issue has not already been filed\r\n- [x] I'm sure that question is related to the library itself and not Bluetooth Low Energy or Classic in general. If that so, please post your question on [StackOverflow](https://stackoverflow.com/questions/tagged/react-native-ble-plx?sort=active) or on our [Gitter](https://gitter.im/RxBLELibraries/react-native-ble) channel.\r\n\r\n## Question\r\nHello, I'm develop application to connect with smart bands, but when I write characteristic (00002a37-0000-1000-8000-00805f9b34fb) of heart rate service (0000180d-0000-1000-8000-00805f9b34fb), the promise always returns error and message is \"Operation was rejected\".\r\n\r\n### My code is:\r\n\r\n```javascript\r\nconst numberBuffer = Buffer.alloc(2);\r\nnumberBuffer.writeUInt16LE(1, 0);\r\nconsole.warn(numberBuffer.toString('base64'))\r\nconst characteristic = await device.writeCharacteristicWithResponseForService(\r\n service, characteristicW, numberBuffer.toString('base64')\r\n)\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Consider OGP/Twitter Card for sharing\n\nIt would be great if slackemogigen has below, so sharing slackemogigen on SNS is more amusing.\r\n\r\n* Twitter card https://developer.twitter.com/en/docs/basics/getting-started\r\n* Open Graph protocol https://ogp.me/\r\n * Slack, Facebook and other SNS","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Bad anchor tag in the contributing docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add me!","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"CKWebSpeech Voice to Text Plugin not working","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"CMAKE instructions for C++ are not exactly correct","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"CONTRIBUTING.md guidelines or template","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Latest doctrine/instantiator bugs out with patchwork","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Remove electron references from docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# netdata master/slave communication Do I need open some port?\n\nHi, good day, I can't find in the documentation what ports must I open in my firewall for the master/slave communication, it's just the port 19999?, can you give me this info?...\r\n\r\nalso, what happens when I have my master netdata under a proxy?...I set nginx for basic http authentication for my dashboard and right now I'm forwarding the port 19998 to the port 8889 so I enter to my dashboard using the port 8889, I think that if my slave expect communicate with the master netdata system probably I'd need set the new port in my slave...where must I do that?...thank you\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"CapabilityStatement/Conformance in CDS Hooks","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Improve documentation\n\nNeed to cleanup the readme and move documentation to `doc` directory or make use of GitHub wiki, etc.\r\n\r\nWhatever makes it easier to use and maintain. \r\n\r\nOne long readme is a pain to maintain and organise.\r\n\r\nOpen to ideas. ","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Fix homepage links to point to truffleframework.com/docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Inform README readers that the documentation reflects master and not the latest release\n\nSo that confusions like the one in https://github.com/cedarcode/webauthn-ruby/issues/201#issuecomment-493105848 doesn't happen.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add build instructions","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Bot likes but do not follows\n\nHi,\r\n\r\nThanks for packaging instabot.py into a docker, really useful to run it on a Synology NAS.\r\nHowever I'm experiencing an issue as the bot is liking pictures but not following any user after many hours.\r\nI changed some parameters from *_per_day to *_per_run as mentioned in the instabot.py documentation, but it's not fixing the issue.\r\n\r\nRegards\r\n\r\n[instabot.config.yml.zip](https://github.com/feedsbrain/instabot-docker/files/3489662/instabot.config.yml.zip)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Susy and Sass issue ....","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add Installing and Deploying information","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Docker for Windows fails to start","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Tags input component should have a `placeholder` prop\n\n!!! IF YOU DO NOT USE THIS ISSUES TEAMPLATE, YOUR ISSUE IS LIABLE TO BEING IGNORED BY US\r\n\r\n\r\n# Prerequisites\r\n\r\nPlease answer the following questions for yourself before submitting an issue.\r\n\r\n- [x] I am running the latest version\r\n- [x] I checked the documentation and found no answer\r\n- [x] I checked to make sure that this issue has not already been filed\r\n- [x] I'm reporting the issue to the correct repository (for multi-repository projects)\r\n\r\n# Expected Behavior\r\n\r\nTags input has placeholder\r\n\r\n# Current Behavior\r\n\r\nIt doesn't\r\n\r\n# Failure Information (for bugs)\r\n\r\nn/a\r\n\r\n## Steps to Reproduce\r\n\r\nn/a\r\n\r\n## Context\r\n\r\nI wanted to make a PR, but in free version there is no tags input, so it seems to be only this place suitable.\r\n\r\nTags input component should have a `placeholder` prop:\r\n```\r\nprops: {\r\n ...\r\n placeholder: {\r\n type: String,\r\n default: 'Add new tag',\r\n description: 'Placeholder for tag input'\r\n }\r\n}\r\n```\r\nAnd tag input should:\r\n```\r\n<input type=\"text\" :placeholder=\"placeholder\" ... >\r\n```\r\n\r\nBecause:\r\n1. You don't always add tags with it (keywords, for example)\r\n2. You don't always make it in English\r\n\r\n## Failure Logs\r\n\r\nn/a\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"arek - Project 3 - Gtihub instructions","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Multiple event temporal correlation when firing an EventAction","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"`DataCollection->toArray` breaks grouping","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Count the blocks until the next turn","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Error creating pull request: Unprocessable Entity (HTTP 422) Invalid value for \"head\"","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Request for documentation on how webpack processes plugins and when a plugin should call `doResolve()`","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Feature: Show CLI for -query flag by default\n\nHi. I often find it useful to show CLI even when -query flag is specified. AFAIK, currently users can't get this behavior without patching the sources.\r\n\r\nPlease consider adding this feature. One could use my hack as a reference.\r\n\r\nhttps://github.com/grwlf/vim-grepper/commit/231d06df26ff9d5df38d82e63630eb868cd10e64","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Readthedoc.io link dead","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# problem with documentation of spherical SLD model\n\nAs reported by Hassan at Barc.n the description of SASview documentation for spheres with spherical sld, the web page\r\nhttp://www.sasview.org/docs/user/models/spherical_sld.html\r\n\r\nindicates some typos, I believe. This could be corrected.\r\n\r\n1)` For a spherically symmetric particle with a particle density \u03c1x(r) the sld function can be defined as:`\r\nIn this formula of f = the denominator should be qr , instead of qr^2\r\n\r\n2) In the subsequent formulae, for f of the core, shell , interface, solvent etc. \r\nthe denominator should contain (qr)^3 , instead of qr^3\r\n\r\n3) Please check the sign for the solvent term\r\n\r\n----------------------\r\nCertainly there is an inconsistency for 1. point 2 seems correct but was clearly very deliberate so should check the math. Finally the code should be checked to verify that it does what the docs says it does.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# OSError: Could not find lib c or load any of its variants []\n\nIssue:\r\n\r\nAfter executing this line ```from shapely.geometry import Line``` I get issues on this \r\n\r\nOSError: Could not find lib c or load any of its variants []. \r\n\r\nSo, I created a new environment in anaconda and set priority to conda-forge so that I wouldn't be mixing packages from different channels as mentioned here: https://conda-forge.org/docs/user/tipsandtricks.html#how-to-fix-it\r\n\r\nHowever, as I install geopandas and run this line in the interpreter, \r\n```\r\nfrom shapely.geometry import Line\r\n```\r\nthe same error comes out.\r\n\r\n- I have tried this: https://conda-forge.org/docs/user/tipsandtricks.html#how-to-fix-it as mentioned in another thread here.\r\n\r\n- I tried setting this: export DYLD_FALLBACK_LIBRARY_PATH=$(HOME)/lib:/usr/local/lib:/lib:/usr/lib \r\n\r\nAlso, I have no other geos installed in my system via homebrew or macports \r\n\r\n**`brew list`**\r\n\r\n```\r\n(base) Jacobs-MacBook-Pro:~ jacob$ brew list\r\napr\t\tgrib2\t\tlibmpc\t\tnetcdf\t\tsqlite\r\napr-util\thdf5\t\tlibsodium\topenssl\t\tsubversion\r\ncmake\t\tinetutils\tlibunistring\tperl\t\tszip\r\ngcc\t\tisl\t\tlz4\t\tproj\t\tutf8proc\r\ngdbm\t\tjpeg\t\tmpfr\t\tpython\t\twget\r\ngettext\t\tlibidn\t\tmpich\t\treadline\txz\r\ngmp\t\tlibidn2\t\tmpich3\t\tspatialindex\tzeromq\r\n\r\n(base) Jacobs-MacBook-Pro:~ jacob$ brew uninstall geos\r\nError: No such keg: /usr/local/Cellar/geos\r\n```\r\n\r\n** `brew cask list` **\r\n\r\n```\r\n(base) Jacobs-MacBook-Pro:~ jacob$ brew cask list\r\njava xquartz\r\n```\r\n\r\nSteps done:\r\n\r\n1. Create a new environment GIS\r\n2. Set channel priority to conda-forge\r\n3. installed geopandas\r\n4. Tried executing ```from shapely.geometry import Line``` however, I still get the same error.\r\n\r\n\r\n<br/>\r\nEnvironment (<code>conda list</code>):\r\n<details>\r\n\r\n```\r\n(GIS) bash-3.2$ conda list\r\n# packages in environment at /Users/jacob/anaconda3/envs/GIS:\r\n#\r\n# Name Version Build Channel\r\nattrs 19.1.0 py_0 conda-forge\r\nboost-cpp 1.70.0 h75728bb_2 conda-forge\r\nbzip2 1.0.8 h01d97ff_0 conda-forge\r\nca-certificates 2019.6.16 hecc5488_0 conda-forge\r\ncairo 1.16.0 he1c11cd_1002 conda-forge\r\ncertifi 2019.6.16 py37_1 conda-forge\r\ncfitsio 3.470 h389770f_1 conda-forge\r\nclick 7.0 py_0 conda-forge\r\nclick-plugins 1.1.1 py_0 conda-forge\r\ncligj 0.5.0 py_0 conda-forge\r\ncurl 7.65.3 h22ea746_0 conda-forge\r\nexpat 2.2.5 h6de7cb9_1003 conda-forge\r\nfiona 1.8.6 py37h39889d8_4 conda-forge\r\nfontconfig 2.13.1 h6b1039f_1001 conda-forge\r\nfreetype 2.10.0 h24853df_1 conda-forge\r\nfreexl 1.0.5 h1de35cc_1002 conda-forge\r\ngdal 2.4.2 py37h39889d8_4 conda-forge\r\ngeopandas 0.5.1 py_0 conda-forge\r\ngeos 3.7.2 h6de7cb9_1 conda-forge\r\ngeotiff 1.5.1 h83de174_2 conda-forge\r\ngettext 0.19.8.1 h46ab8bc_1002 conda-forge\r\ngiflib 5.1.7 h01d97ff_1 conda-forge\r\nglib 2.58.3 h9d45998_1002 conda-forge\r\nhdf4 4.2.13 hf3c6af0_1002 conda-forge\r\nhdf5 1.10.5 nompi_h0cbb7df_1100 conda-forge\r\nicu 64.2 h6de7cb9_0 conda-forge\r\njpeg 9c h1de35cc_1001 conda-forge\r\njson-c 0.13.1 h1de35cc_1001 conda-forge\r\nkealib 1.4.10 h6659575_1005 conda-forge\r\nkrb5 1.16.3 hcfa6398_1001 conda-forge\r\nlibblas 3.8.0 11_openblas conda-forge\r\nlibcblas 3.8.0 11_openblas conda-forge\r\nlibcurl 7.65.3 h16faf7d_0 conda-forge\r\nlibcxx 8.0.0 4 conda-forge\r\nlibcxxabi 8.0.0 4 conda-forge\r\nlibdap4 3.20.2 hae55d67_1000 conda-forge\r\nlibedit 3.1.20170329 hcfe32e1_1001 conda-forge\r\nlibffi 3.2.1 
h6de7cb9_1006 conda-forge\r\nlibgdal 2.4.2 hf77bb78_4 conda-forge\r\nlibgfortran 3.0.1 0 conda-forge\r\nlibiconv 1.15 h01d97ff_1005 conda-forge\r\nlibkml 1.3.0 hed7d534_1010 conda-forge\r\nliblapack 3.8.0 11_openblas conda-forge\r\nlibnetcdf 4.6.2 h1a02027_1002 conda-forge\r\nlibopenblas 0.3.6 hd44dcd8_6 conda-forge\r\nlibpng 1.6.37 h2573ce8_0 conda-forge\r\nlibpq 11.5 h756f0eb_0 conda-forge\r\nlibspatialindex 1.9.0 h6de7cb9_1 conda-forge\r\nlibspatialite 4.3.0a he369b6e_1029 conda-forge\r\nlibssh2 1.8.2 hcdc9a53_2 conda-forge\r\nlibtiff 4.0.10 hd08fb8f_1003 conda-forge\r\nlibxml2 2.9.9 h12c6b28_2 conda-forge\r\nlz4-c 1.8.3 h6de7cb9_1001 conda-forge\r\nmunch 2.3.2 py_0 conda-forge\r\nncurses 6.1 h0a44026_1002 conda-forge\r\nnumpy 1.17.0 py37h6b0580a_0 conda-forge\r\nopenjpeg 2.3.1 hc1feee7_0 conda-forge\r\nopenssl 1.1.1c h01d97ff_0 conda-forge\r\npandas 0.25.0 py37h86efe34_0 conda-forge\r\npcre 8.41 h0a44026_1003 conda-forge\r\npip 19.2.1 py37_0 conda-forge\r\npixman 0.38.0 h01d97ff_1003 conda-forge\r\npoppler 0.67.0 hd5eb092_7 conda-forge\r\npoppler-data 0.4.9 1 conda-forge\r\npostgresql 11.5 h25afefd_0 conda-forge\r\nproj4 6.1.0 h2cc77ee_2 conda-forge\r\npyproj 2.2.1 py37h804dea5_0 conda-forge\r\npython 3.7.3 h93065d6_1 conda-forge\r\npython-dateutil 2.8.0 py_0 conda-forge\r\npytz 2019.2 py_0 conda-forge\r\nreadline 8.0 hcfe32e1_0 conda-forge\r\nrtree 0.8.3 py37h666c49c_1002 conda-forge\r\nsetuptools 41.0.1 py37_0 conda-forge\r\nshapely 1.6.4 py37h0567c5e_1006 conda-forge\r\nsix 1.12.0 py37_1000 conda-forge\r\nsqlite 3.29.0 hb7d70f7_0 conda-forge\r\ntk 8.6.9 h2573ce8_1002 conda-forge\r\ntzcode 2019a h01d97ff_1002 conda-forge\r\nwheel 0.33.4 py37_0 conda-forge\r\nxerces-c 3.2.2 hbda6038_1004 conda-forge\r\nxz 5.2.4 h1de35cc_1001 conda-forge\r\nzlib 1.2.11 h01d97ff_1005 conda-forge\r\nzstd 1.4.0 ha9f0a20_0 conda-forge\r\n\r\n```\r\n</details>\r\n\r\n<br/>\r\nDetails about <code>conda</code> and system ( <code>conda info</code> ):\r\n<details>\r\n\r\n```\r\nactive environment : GIS\r\n active env location : /Users/jacob/anaconda3/envs/GIS\r\n shell level : 2\r\n user config file : /Users/jacob/.condarc\r\n populated config files : /Users/jacob/.condarc\r\n conda version : 4.7.10\r\n conda-build version : 3.18.8\r\n python version : 3.7.3.final.0\r\n virtual packages : \r\n base environment : /Users/jacob/anaconda3 (writable)\r\n channel URLs : https://conda.anaconda.org/conda-forge/osx-64\r\n https://conda.anaconda.org/conda-forge/noarch\r\n https://repo.anaconda.com/pkgs/main/osx-64\r\n https://repo.anaconda.com/pkgs/main/noarch\r\n https://repo.anaconda.com/pkgs/r/osx-64\r\n https://repo.anaconda.com/pkgs/r/noarch\r\n package cache : /Users/jacob/anaconda3/pkgs\r\n /Users/jacob/.conda/pkgs\r\n envs directories : /Users/jacob/anaconda3/envs\r\n /Users/jacob/.conda/envs\r\n platform : osx-64\r\n user-agent : conda/4.7.10 requests/2.22.0 CPython/3.7.3 Darwin/18.7.0 OSX/10.14.6\r\n UID:GID : 501:20\r\n netrc file : None\r\n offline mode : False\r\n\r\n```\r\n</details>\r\n\r\n\r\n</details>\r\n\r\n<br/>\r\nDetails about <code>conda</code> and system ( <code>conda info -s</code> ):\r\n<details>\r\n\r\n```\r\n(GIS) bash-3.2$ conda info -s\r\nsys.version: 3.7.3 (default, Mar 27 2019, 16:54:48) \r\n...\r\nsys.prefix: /Users/jacob/anaconda3\r\nsys.executable: /Users/jacob/anaconda3/bin/python\r\nconda location: /Users/jacob/anaconda3/lib/python3.7/site-packages/conda\r\nconda-build: /Users/jacob/anaconda3/bin/conda-build\r\nconda-convert: /Users/jacob/anaconda3/bin/conda-convert\r\nconda-debug: 
/Users/jacob/anaconda3/bin/conda-debug\r\nconda-develop: /Users/jacob/anaconda3/bin/conda-develop\r\nconda-env: /Users/jacob/anaconda3/bin/conda-env\r\nconda-index: /Users/jacob/anaconda3/bin/conda-index\r\nconda-inspect: /Users/jacob/anaconda3/bin/conda-inspect\r\nconda-metapackage: /Users/jacob/anaconda3/bin/conda-metapackage\r\nconda-render: /Users/jacob/anaconda3/bin/conda-render\r\nconda-server: /Users/jacob/anaconda3/bin/conda-server\r\nconda-skeleton: /Users/jacob/anaconda3/bin/conda-skeleton\r\nconda-verify: /Users/jacob/anaconda3/bin/conda-verify\r\nuser site dirs: \r\n\r\nCIO_TEST: <not set>\r\nCONDA_DEFAULT_ENV: GIS\r\nCONDA_EXE: /Users/jacob/anaconda3/bin/conda\r\nCONDA_PREFIX: /Users/jacob/anaconda3/envs/GIS\r\nCONDA_PREFIX_1: /Users/jacob/anaconda3\r\nCONDA_PROMPT_MODIFIER: (GIS) \r\nCONDA_PYTHON_EXE: /Users/jacob/anaconda3/bin/python\r\nCONDA_ROOT: /Users/jacob/anaconda3\r\nCONDA_SHLVL: 2\r\nPATH: /Users/jacob/anaconda3/bin:/Users/jacob/anaconda3/envs/GIS/bin:/Users/jacob/anaconda3/condabin:/usr/local/git/bin:/sw/bin:/usr/local/bin:/usr/local:/usr/local/sbin:/usr/local/mysql/bin:/usr/local/Cellar/netcdf/4.6.3_1/bin:/usr/local/Cellar/mpich3/include:/usr/local/opt/python/libexec/bin:/Users/jacob/Desktop/WRF4/ncl/ncl_6/bin:/usr/bin:/bin:/usr/sbin:/sbin:/usr/local/bin:/usr/X11/bin:/usr/local/opt/inetutils/libexec/gnubin:/usr/local/bin:/usr/local/sbin:/Users/jacob/bin:/Library/Frameworks/Python.framework/Versions/3.6/bin:/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin:/opt/X11/bin\r\nREQUESTS_CA_BUNDLE: <not set>\r\nSSL_CERT_FILE: <not set>\r\n\r\n```\r\n</details>\r\n\r\n</details>\r\n\r\n<br/>\r\nDetails about <code>cat .condarc</code>:\r\n<details>\r\n\r\n```\r\n`(GIS) bash-3.2$ cat .condarc\r\nssl_verify: true\r\nchannels:\r\n - conda-forge\r\n - defaults\r\nchannel_priority: strict`\r\n\r\n```\r\n</details>\r\n\r\nThe error from the python prompt is still the same:\r\n\r\n`from shapely.geometry import Line`\r\n\r\n```\r\nLast login: Sat Aug 10 23:00:22 on ttys000\r\n-bash: HOME: command not found\r\n/Users/jacob/.anaconda/navigator/a.tool ; exit;\r\n(base) Jacobs-MacBook-Pro:~ jacob$ /Users/jacob/.anaconda/navigator/a.tool ; exit;\r\nPython 3.7.3 | packaged by conda-forge | (default, Jul 1 2019, 14:38:56) \r\n[Clang 4.0.1 (tags/RELEASE_401/final)] :: Anaconda, Inc. 
on darwin\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> from shapely.geometry import Line\r\nTraceback (most recent call last):\r\n File \"<stdin>\", line 1, in <module>\r\n File \"/Users/jacob/anaconda3/envs/GIS/lib/python3.7/site-packages/shapely/geometry/__init__.py\", line 4, in <module>\r\n from .base import CAP_STYLE, JOIN_STYLE\r\n File \"/Users/jacob/anaconda3/envs/GIS/lib/python3.7/site-packages/shapely/geometry/base.py\", line 17, in <module>\r\n from shapely.coords import CoordinateSequence\r\n File \"/Users/jacob/anaconda3/envs/GIS/lib/python3.7/site-packages/shapely/coords.py\", line 8, in <module>\r\n from shapely.geos import lgeos\r\n File \"/Users/jacob/anaconda3/envs/GIS/lib/python3.7/site-packages/shapely/geos.py\", line 113, in <module>\r\n free = load_dll('c').free\r\n File \"/Users/jacob/anaconda3/envs/GIS/lib/python3.7/site-packages/shapely/geos.py\", line 56, in load_dll\r\n libname, fallbacks or []))\r\nOSError: Could not find lib c or load any of its variants [].\r\n```\r\n\r\n\r\nList of files inside ../GIS/lib:\r\n```\r\n(base) Jacobs-MacBook-Pro:lib jacob$ pwd\r\n/Users/jacob/anaconda3/envs/GIS/lib\r\n(base) Jacobs-MacBook-Pro:lib jacob$ ls\r\nGNU.Gettext.dll\t\t\t\tlibicui18n.dylib\r\nTk.icns\t\t\t\t\tlibicuio.64.2.dylib\r\nTk.tiff\t\t\t\t\tlibicuio.64.dylib\r\n_int.so\t\t\t\t\tlibicuio.dylib\r\nadminpack.so\t\t\t\tlibicutest.64.2.dylib\r\namcheck.so\t\t\t\tlibicutest.64.dylib\r\nascii_and_mic.so\t\t\tlibicutest.dylib\r\nauth_delay.so\t\t\t\tlibicutu.64.2.dylib\r\nauto_explain.so\t\t\t\tlibicutu.64.dylib\r\nautoinc.so\t\t\t\tlibicutu.dylib\r\nbloom.so\t\t\t\tlibicuuc.64.2.dylib\r\nbtree_gin.so\t\t\t\tlibicuuc.64.dylib\r\nbtree_gist.so\t\t\t\tlibicuuc.dylib\r\ncairo\t\t\t\t\tlibintl.8.dylib\r\ncharset.alias\t\t\t\tlibintl.a\r\ncitext.so\t\t\t\tlibintl.dylib\r\ncmake\t\t\t\t\tlibjpeg.9.dylib\r\ncube.so\t\t\t\t\tlibjpeg.a\r\ncyrillic_and_mic.so\t\t\tlibjpeg.dylib\r\ndblink.so\t\t\t\tlibjson-c.4.dylib\r\ndict_int.so\t\t\t\tlibjson-c.a\r\ndict_snowball.so\t\t\tlibjson-c.dylib\r\ndict_xsyn.so\t\t\t\tlibk5crypto.3.1.dylib\r\nearthdistance.so\t\t\tlibk5crypto.3.dylib\r\nengines-1.1\t\t\t\tlibk5crypto.dylib\r\neuc2004_sjis2004.so\t\t\tlibkadm5clnt.dylib\r\neuc_cn_and_mic.so\t\t\tlibkadm5clnt_mit.11.0.dylib\r\neuc_jp_and_sjis.so\t\t\tlibkadm5clnt_mit.11.dylib\r\neuc_kr_and_mic.so\t\t\tlibkadm5clnt_mit.dylib\r\neuc_tw_and_big5.so\t\t\tlibkadm5srv.dylib\r\nfile_fdw.so\t\t\t\tlibkadm5srv_mit.11.0.dylib\r\nfuzzystrmatch.so\t\t\tlibkadm5srv_mit.11.dylib\r\ngettext\t\t\t\t\tlibkadm5srv_mit.dylib\r\ngirepository-1.0\t\t\tlibkdb5.9.0.dylib\r\nglib-2.0\t\t\t\tlibkdb5.9.dylib\r\nhstore.so\t\t\t\tlibkdb5.dylib\r\nicu\t\t\t\t\tlibkea.1.4.10.dylib\r\ninsert_username.so\t\t\tlibkea.1.4.dylib\r\nisn.so\t\t\t\t\tlibkea.dylib\r\nitcl4.1.2\t\t\t\tlibkmlbase.1.3.0.dylib\r\nkrb5\t\t\t\t\tlibkmlbase.1.dylib\r\nlatin2_and_win1250.so\t\t\tlibkmlbase.dylib\r\nlatin_and_mic.so\t\t\tlibkmlconvenience.1.3.0.dylib\r\nlibasprintf.0.dylib\t\t\tlibkmlconvenience.1.dylib\r\nlibasprintf.a\t\t\t\tlibkmlconvenience.dylib\r\nlibasprintf.dylib\t\t\tlibkmldom.1.3.0.dylib\r\nlibblas.3.dylib\t\t\t\tlibkmldom.1.dylib\r\nlibblas.dylib\t\t\t\tlibkmldom.dylib\r\nlibboost_atomic.a\t\t\tlibkmlengine.1.3.0.dylib\r\nlibboost_atomic.dylib\t\t\tlibkmlengine.1.dylib\r\nlibboost_chrono.a\t\t\tlibkmlengine.dylib\r\nlibboost_chrono.dylib\t\t\tlibkmlregionator.1.3.0.dylib\r\nlibboost_container.a\t\t\tlibkmlregionator.1.dylib\r\nlibboost_container.dylib\t\tlibkmlregionator.dylib\r\nli
bboost_context.a\t\t\tlibkmlxsd.1.3.0.dylib\r\nlibboost_context.dylib\t\t\tlibkmlxsd.1.dylib\r\nlibboost_contract.a\t\t\tlibkmlxsd.dylib\r\nlibboost_contract.dylib\t\t\tlibkrad.0.0.dylib\r\nlibboost_coroutine.a\t\t\tlibkrad.0.dylib\r\nlibboost_coroutine.dylib\t\tlibkrad.dylib\r\nlibboost_date_time.a\t\t\tlibkrb5.3.3.dylib\r\nlibboost_date_time.dylib\t\tlibkrb5.3.dylib\r\nlibboost_exception.a\t\t\tlibkrb5.dylib\r\nlibboost_fiber.a\t\t\tlibkrb5support.1.1.dylib\r\nlibboost_fiber.dylib\t\t\tlibkrb5support.1.dylib\r\nlibboost_filesystem.a\t\t\tlibkrb5support.dylib\r\nlibboost_filesystem.dylib\t\tliblapack.3.dylib\r\nlibboost_graph.a\t\t\tliblapack.dylib\r\nlibboost_graph.dylib\t\t\tliblz4.1.8.3.dylib\r\nlibboost_iostreams.a\t\t\tliblz4.1.dylib\r\nlibboost_iostreams.dylib\t\tliblz4.a\r\nlibboost_locale.a\t\t\tliblz4.dylib\r\nlibboost_locale.dylib\t\t\tliblzma.5.dylib\r\nlibboost_log.a\t\t\t\tliblzma.a\r\nlibboost_log.dylib\t\t\tliblzma.dylib\r\nlibboost_log_setup.a\t\t\tlibmenu.6.dylib\r\nlibboost_log_setup.dylib\t\tlibmenu.a\r\nlibboost_math_c99.a\t\t\tlibmenu.dylib\r\nlibboost_math_c99.dylib\t\t\tlibmenuw.6.dylib\r\nlibboost_math_c99f.a\t\t\tlibmenuw.a\r\nlibboost_math_c99f.dylib\t\tlibmenuw.dylib\r\nlibboost_math_c99l.a\t\t\tlibmfhdf.0.dylib\r\nlibboost_math_c99l.dylib\t\tlibmfhdf.a\r\nlibboost_math_tr1.a\t\t\tlibmfhdf.dylib\r\nlibboost_math_tr1.dylib\t\t\tlibminizip.dylib\r\nlibboost_math_tr1f.a\t\t\tlibncurses++.a\r\nlibboost_math_tr1f.dylib\t\tlibncurses++w.a\r\nlibboost_math_tr1l.a\t\t\tlibncurses.6.dylib\r\nlibboost_math_tr1l.dylib\t\tlibncurses.a\r\nlibboost_prg_exec_monitor.a\t\tlibncurses.dylib\r\nlibboost_prg_exec_monitor.dylib\t\tlibncursesw.6.dylib\r\nlibboost_program_options.a\t\tlibncursesw.a\r\nlibboost_program_options.dylib\t\tlibncursesw.dylib\r\nlibboost_random.a\t\t\tlibnetcdf.13.dylib\r\nlibboost_random.dylib\t\t\tlibnetcdf.a\r\nlibboost_regex.a\t\t\tlibnetcdf.dylib\r\nlibboost_regex.dylib\t\t\tlibnetcdf.settings\r\nlibboost_serialization.a\t\tlibopenblas.0.dylib\r\nlibboost_serialization.dylib\t\tlibopenblasp-r0.3.6.dylib\r\nlibboost_stacktrace_addr2line.a\t\tlibopenjp2.2.3.1.dylib\r\nlibboost_stacktrace_addr2line.dylib\tlibopenjp2.7.dylib\r\nlibboost_stacktrace_basic.a\t\tlibopenjp2.a\r\nlibboost_stacktrace_basic.dylib\t\tlibopenjp2.dylib\r\nlibboost_stacktrace_noop.a\t\tlibpanel.6.dylib\r\nlibboost_stacktrace_noop.dylib\t\tlibpanel.a\r\nlibboost_system.a\t\t\tlibpanel.dylib\r\nlibboost_system.dylib\t\t\tlibpanelw.6.dylib\r\nlibboost_test_exec_monitor.a\t\tlibpanelw.a\r\nlibboost_thread.a\t\t\tlibpanelw.dylib\r\nlibboost_thread.dylib\t\t\tlibpcre.1.dylib\r\nlibboost_timer.a\t\t\tlibpcre.a\r\nlibboost_timer.dylib\t\t\tlibpcre.dylib\r\nlibboost_type_erasure.a\t\t\tlibpcrecpp.0.dylib\r\nlibboost_type_erasure.dylib\t\tlibpcrecpp.a\r\nlibboost_unit_test_framework.a\t\tlibpcrecpp.dylib\r\nlibboost_unit_test_framework.dylib\tlibpcreposix.0.dylib\r\nlibboost_wave.a\t\t\t\tlibpcreposix.a\r\nlibboost_wave.dylib\t\t\tlibpcreposix.dylib\r\nlibboost_wserialization.a\t\tlibpgcommon.a\r\nlibboost_wserialization.dylib\t\tlibpgfeutils.a\r\nlibbz2.1.0.8.dylib\t\t\tlibpgport.a\r\nlibbz2.a\t\t\t\tlibpgtypes.3.11.dylib\r\nlibbz2.dylib\t\t\t\tlibpgtypes.3.dylib\r\nlibc++.1.0.dylib\t\t\tlibpgtypes.a\r\nlibc++.1.dylib\t\t\t\tlibpgtypes.dylib\r\nlibc++.a\t\t\t\tlibpixman-1.0.dylib\r\nlibc++.dylib\t\t\t\tlibpixman-1.a\r\nlibc++abi.1.0.dylib\t\t\tlibpixman-1.dylib\r\nlibc++abi.1.dylib\t\t\tlibpng.a\r\nlibc++abi.a\t\t\t\tlibpng.dylib\r\nlibc++abi.dylib\t\t\t\tlibpng16.16.dylib\r\nlibc++experimental.
a\t\t\tlibpng16.a\r\nlibc++fs.a\t\t\t\tlibpng16.dylib\r\nlibcairo-gobject.2.dylib\t\tlibpoppler-cpp.0.5.0.dylib\r\nlibcairo-gobject.a\t\t\tlibpoppler-cpp.0.dylib\r\nlibcairo-gobject.dylib\t\t\tlibpoppler-cpp.dylib\r\nlibcairo-script-interpreter.2.dylib\tlibpoppler-glib.8.9.0.dylib\r\nlibcairo-script-interpreter.a\t\tlibpoppler-glib.8.dylib\r\nlibcairo-script-interpreter.dylib\tlibpoppler-glib.dylib\r\nlibcairo.2.dylib\t\t\tlibpoppler.78.0.0.dylib\r\nlibcairo.a\t\t\t\tlibpoppler.78.dylib\r\nlibcairo.dylib\t\t\t\tlibpoppler.dylib\r\nlibcblas.3.dylib\t\t\tlibpq.5.11.dylib\r\nlibcblas.dylib\t\t\t\tlibpq.5.dylib\r\nlibcfitsio.8.3.47.dylib\t\t\tlibpq.a\r\nlibcfitsio.8.dylib\t\t\tlibpq.dylib\r\nlibcfitsio.a\t\t\t\tlibpqwalreceiver.so\r\nlibcfitsio.dylib\t\t\tlibproj.15.dylib\r\nlibcharset.1.dylib\t\t\tlibproj.a\r\nlibcharset.a\t\t\t\tlibproj.dylib\r\nlibcharset.dylib\t\t\tlibpython3.7m.a\r\nlibcom_err.3.0.dylib\t\t\tlibpython3.7m.dylib\r\nlibcom_err.3.dylib\t\t\tlibquadmath.0.dylib\r\nlibcom_err.dylib\t\t\tlibquadmath.dylib\r\nlibcrypto.1.1.dylib\t\t\tlibreadline.8.0.dylib\r\nlibcrypto.dylib\t\t\t\tlibreadline.8.dylib\r\nlibcurl.4.dylib\t\t\t\tlibreadline.a\r\nlibcurl.a\t\t\t\tlibreadline.dylib\r\nlibcurl.dylib\t\t\t\tlibspatialindex.5.0.0.dylib\r\nlibdap.25.dylib\t\t\t\tlibspatialindex.5.dylib\r\nlibdap.a\t\t\t\tlibspatialindex.dylib\r\nlibdap.dylib\t\t\t\tlibspatialindex_c.5.0.0.dylib\r\nlibdapclient.6.dylib\t\t\tlibspatialindex_c.5.dylib\r\nlibdapclient.a\t\t\t\tlibspatialindex_c.dylib\r\nlibdapclient.dylib\t\t\tlibspatialite.7.dylib\r\nlibdapserver.7.dylib\t\t\tlibspatialite.dylib\r\nlibdapserver.a\t\t\t\tlibsqlite3.0.dylib\r\nlibdapserver.dylib\t\t\tlibsqlite3.a\r\nlibdf.0.dylib\t\t\t\tlibsqlite3.dylib\r\nlibdf.a\t\t\t\t\tlibssh2.1.0.1.dylib\r\nlibdf.dylib\t\t\t\tlibssh2.1.dylib\r\nlibecpg.6.11.dylib\t\t\tlibssh2.a\r\nlibecpg.6.dylib\t\t\t\tlibssh2.dylib\r\nlibecpg.a\t\t\t\tlibssl.1.1.dylib\r\nlibecpg.dylib\t\t\t\tlibssl.dylib\r\nlibecpg_compat.3.11.dylib\t\tlibtcl8.6.dylib\r\nlibecpg_compat.3.dylib\t\t\tlibtclstub8.6.a\r\nlibecpg_compat.a\t\t\tlibtest-types.a\r\nlibecpg_compat.dylib\t\t\tlibtiff.5.dylib\r\nlibedit.0.dylib\t\t\t\tlibtiff.a\r\nlibedit.a\t\t\t\tlibtiff.dylib\r\nlibedit.dylib\t\t\t\tlibtiffxx.5.dylib\r\nlibexpat.1.6.7.dylib\t\t\tlibtiffxx.a\r\nlibexpat.1.dylib\t\t\tlibtiffxx.dylib\r\nlibexpat.a\t\t\t\tlibtinfo.6.dylib\r\nlibexpat.dylib\t\t\t\tlibtinfo.a\r\nlibffi.6.dylib\t\t\t\tlibtinfo.dylib\r\nlibffi.a\t\t\t\tlibtinfow.6.dylib\r\nlibffi.dylib\t\t\t\tlibtinfow.a\r\nlibfontconfig.1.dylib\t\t\tlibtinfow.dylib\r\nlibfontconfig.a\t\t\t\tlibtk8.6.dylib\r\nlibfontconfig.dylib\t\t\tlibtkstub8.6.a\r\nlibform.6.dylib\t\t\t\tlibtz.a\r\nlibform.a\t\t\t\tliburiparser.dylib\r\nlibform.dylib\t\t\t\tlibverto.0.0.dylib\r\nlibformw.6.dylib\t\t\tlibverto.0.dylib\r\nlibformw.a\t\t\t\tlibverto.dylib\r\nlibformw.dylib\t\t\t\tlibxerces-c-3.2.dylib\r\nlibfreetype.6.dylib\t\t\tlibxerces-c.dylib\r\nlibfreetype.a\t\t\t\tlibxml2.2.dylib\r\nlibfreetype.dylib\t\t\tlibxml2.dylib\r\nlibfreexl.1.dylib\t\t\tlibz.1.2.11.dylib\r\nlibfreexl.a\t\t\t\tlibz.1.dylib\r\nlibfreexl.dylib\t\t\t\tlibz.a\r\nlibgcc_s.1.dylib\t\t\tlibz.dylib\r\nlibgcc_s_ppc64.1.dylib\t\t\tlibzstd.1.4.0.dylib\r\nlibgcc_s_x86_64.1.dylib\t\t\tlibzstd.1.dylib\r\nlibgdal.20.dylib\t\t\tlibzstd.a\r\nlibgdal.a\t\t\t\tlibzstd.dylib\r\nlibgdal.dylib\t\t\t\tlo.so\r\nlibgeos-3.7.2.dylib\t\t\tltree.so\r\nlibgeos.dylib\t\t\t\tmod_spatialite.7.so\r\nlibgeos_c.1.dylib\t\t\tmod_spatialite.so\r\nlibgeos_c.dylib\t\t\t\tmoddatetime.so\r\nlibgeotiff.5.0.0.dylib\
t\t\topenjpeg-2.3\r\nlibgeotiff.5.dylib\t\t\tpageinspect.so\r\nlibgeotiff.dylib\t\t\tpasswordcheck.so\r\nlibgettextlib-0.19.8.1.dylib\t\tpg_buffercache.so\r\nlibgettextlib.dylib\t\t\tpg_freespacemap.so\r\nlibgettextpo.0.dylib\t\t\tpg_prewarm.so\r\nlibgettextpo.a\t\t\t\tpg_stat_statements.so\r\nlibgettextpo.dylib\t\t\tpg_trgm.so\r\nlibgettextsrc-0.19.8.1.dylib\t\tpg_visibility.so\r\nlibgettextsrc.dylib\t\t\tpgcrypto.so\r\nlibgfortran.3.dylib\t\t\tpgoutput.so\r\nlibgfortran.dylib\t\t\tpgrowlocks.so\r\nlibgif.a\t\t\t\tpgstattuple.so\r\nlibgif.so\t\t\t\tpgxml.so\r\nlibgif.so.7\t\t\t\tpgxs\r\nlibgif.so.7.1.0\t\t\t\tpkgconfig\r\nlibgio-2.0.0.dylib\t\t\tplpgsql.so\r\nlibgio-2.0.dylib\t\t\tpostgres_fdw.so\r\nlibglib-2.0.0.dylib\t\t\tpython3.7\r\nlibglib-2.0.dylib\t\t\trefint.so\r\nlibgmodule-2.0.0.dylib\t\t\tseg.so\r\nlibgmodule-2.0.dylib\t\t\tsqlite3.25.3\r\nlibgobject-2.0.0.dylib\t\t\tsslinfo.so\r\nlibgobject-2.0.dylib\t\t\ttablefunc.so\r\nlibgomp.1.dylib\t\t\t\ttcl8\r\nlibgomp.dylib\t\t\t\ttcl8.6\r\nlibgssapi_krb5.2.2.dylib\t\ttclConfig.sh\r\nlibgssapi_krb5.2.dylib\t\t\ttclooConfig.sh\r\nlibgssapi_krb5.dylib\t\t\ttcn.so\r\nlibgssrpc.4.2.dylib\t\t\ttdbc1.1.0\r\nlibgssrpc.4.dylib\t\t\ttdbcmysql1.1.0\r\nlibgssrpc.dylib\t\t\t\ttdbcodbc1.1.0\r\nlibgthread-2.0.0.dylib\t\t\ttdbcpostgres1.1.0\r\nlibgthread-2.0.dylib\t\t\tterminfo\r\nlibhdf4.settings\t\t\tterminfo.c~\r\nlibhdf5.103.dylib\t\t\ttest_decoding.so\r\nlibhdf5.a\t\t\t\tthread2.8.4\r\nlibhdf5.dylib\t\t\t\ttimetravel.so\r\nlibhdf5.settings\t\t\ttk8.6\r\nlibhdf5_cpp.103.dylib\t\t\ttkConfig.sh\r\nlibhdf5_cpp.a\t\t\t\ttsm_system_rows.so\r\nlibhdf5_cpp.dylib\t\t\ttsm_system_time.so\r\nlibhdf5_fortran.a\t\t\tunaccent.so\r\nlibhdf5_hl.100.dylib\t\t\tutf8_and_ascii.so\r\nlibhdf5_hl.a\t\t\t\tutf8_and_big5.so\r\nlibhdf5_hl.dylib\t\t\tutf8_and_cyrillic.so\r\nlibhdf5_hl_cpp.100.dylib\t\tutf8_and_euc2004.so\r\nlibhdf5_hl_cpp.a\t\t\tutf8_and_euc_cn.so\r\nlibhdf5_hl_cpp.dylib\t\t\tutf8_and_euc_jp.so\r\nlibhdf5_hl_fortran.a\t\t\tutf8_and_euc_kr.so\r\nlibhdf5hl_fortran.a\t\t\tutf8_and_euc_tw.so\r\nlibhistory.8.0.dylib\t\t\tutf8_and_gb18030.so\r\nlibhistory.8.dylib\t\t\tutf8_and_gbk.so\r\nlibhistory.a\t\t\t\tutf8_and_iso8859.so\r\nlibhistory.dylib\t\t\tutf8_and_iso8859_1.so\r\nlibiconv.2.dylib\t\t\tutf8_and_johab.so\r\nlibiconv.a\t\t\t\tutf8_and_sjis.so\r\nlibiconv.dylib\t\t\t\tutf8_and_sjis2004.so\r\nlibicudata.64.2.dylib\t\t\tutf8_and_uhc.so\r\nlibicudata.64.dylib\t\t\tutf8_and_win.so\r\nlibicudata.dylib\t\t\tuuid-ossp.so\r\nlibicui18n.64.2.dylib\t\t\txml2Conf.sh\r\nlibicui18n.64.dylib\r\n```\r\nI created a new environment and set conda-forge to priority and set this `conda config --set channel_priority strict`. I am not sure why this is still happening.\r\n\r\nI also made sure that there were no other geos versions installed in my system.\r\n\r\nIt seems that a lot of osx users come across this problem.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Docker image to run cucumber - js","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# LICENSE file is missing\n\nmeson.build and README.md define MIT but MIT has [many styles](https://fedoraproject.org/wiki/Licensing:MIT).","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Contribution guide\n\nHey, thank you for a great job.\r\nCan you please provide a simple contribution guide and docs for development setup?\r\nThey will help a lot as a starting point.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"[Support] Need example of returning a list of objects; maybe a bug?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Property 'error' does not exist on type 'void | FetchResult<MutationResponse, Record<string, any>, Record<string, any>>\n\nI'm trying to explicitly type the `mutate` function returned by the graphql higher order component, but I'm running into issues trying to destructure the `error` property from the return value of a call to that function. My original code was this, based on the documentation on [render-prop-function](https://www.apollographql.com/docs/react/api/react-components/#render-prop-function-1):\r\n\r\n`const { error } = await mutate({\r\n variables: {\r\n input: mutationValues\r\n }\r\n });`\r\n\r\nBut that generated the compile error in the title of this issue. The only way I could get this to compile was the following:\r\n\r\n`const mutationResult = await mutate({\r\n variables: {\r\n input: mutationValues\r\n }\r\n });\r\nconst { errors } = mutationResult as ExecutionResult;`\r\n\r\nIs there a way to get the `error` or `errors` property from the result of the call to `mutate` without casting `mutationResult` as type `ExecutionResult`? There must be something I'm missing here. Using react-apollo version 2.5.8.\r\n\r\nEDIT: I should note that the above applies to a `mutate` function that I have typed as `MutationFn<MyMutation, MyMutationVariables>`. I freely admit this may be the wrong type, but I'm not sure how else to type this function. If I type it as `any` then the very first line of code above with the destructured `error` property works, but I'd rather not resort to the `any` type just to get this to work.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Mode not preserved when uploading files\n\nThe [buildbot documentation](http://docs.buildbot.net/latest/manual/configuration/buildsteps.html#transferring-files) says this: \r\n\r\n> The copied file will have the same permissions on the master as on the worker, look at the mode= parameter to set it differently.\r\n\r\nHowever, as far as I can tell, that's not true:\r\n\r\n1. Experimentally, if I have a file with 0o444 (all read-only) permissions and upload it with the step `steps.FileUpload(workersrc='build/artifact.main.c.log', masterdest='myfile.log')`, it appears on master with 0o600 (owner read-write) permissions.\r\n2. The transfer file code appears to have no way of reading the original mode and transfering it as part of the file transfer, which would make it hard to set properly.\r\n3. The `FileWriter` class only attempts to set mode on the file if its mode property is set, which seems to correspond to passing the mode option into the FileUpload step. \r\n4. At a glance, most (all?) of the tests that validate file transfers do not validate the mode when the option isn't passed.\r\n\r\nAm I missing something tricky for how mode should be getting set by default? It's a great idea to default the permissions to be the same. \r\n\r\nFor context, I'm running buildbot worker and master at 2.3.1 on macOS. ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Why not add another formula for selective system python shadowing?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"nextcloud 12 occ gallery don't exist. this is lie application is installed and worked","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Document `Error` comparison algo","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"/bin/bash: no such file or directory","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Problem: sharing stack with closures prevents typing","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Windows installation completes but won't start","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [REVIEW]: A course on the implicit finite volume method for CFD using Python\n\n**Submitting author:** @ctdegroot (<a href=\"http://orcid.org/0000-0002-2069-8253\">Christopher DeGroot</a>)\n**Repository:** <a href=\"https://bitbucket.org/cdegroot/cfdcourse/\" target =\"_blank\">https://bitbucket.org/cdegroot/cfdcourse/</a>\n**Version:** v1.0\n**Editor:** @IanHawke\n**Reviewer:** @sconde, @zingale\n**Archive:** Pending\n\n## Status\n\n[](http://jose.theoj.org/papers/2f41f8e108eac1c0bf482f1fa9968008)\n\nStatus badge code:\n\n```\nHTML: <a href=\"http://jose.theoj.org/papers/2f41f8e108eac1c0bf482f1fa9968008\"><img src=\"http://jose.theoj.org/papers/2f41f8e108eac1c0bf482f1fa9968008/status.svg\"></a>\nMarkdown: [](http://jose.theoj.org/papers/2f41f8e108eac1c0bf482f1fa9968008)\n```\n**Reviewers and authors**:\n\nPlease avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the <a href=\"https://bitbucket.org/cdegroot/cfdcourse/\" target=\"_blank\">target repository</a> and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)\n\n## Reviewer instructions & questions\n\n@sconde & @zingale, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:\n1. Make sure you're logged in to your GitHub account\n2. Be sure to accept the invite at this URL: https://github.com/openjournals/jose-reviews/invitations\n\nThe reviewer guidelines are available here: https://jose.theoj.org/about#reviewer_guidelines. Any questions/concerns please let @IanHawke know.\n\n## Review checklist for @sconde\n\n### Conflict of interest\n\n- [ ] As the reviewer I confirm that I have read the [JOSE conflict of interest policy](https://github.com/openjournals/jose/blob/master/COI.md) and that there are no conflicts of interest for me to review this work.\n\n### Code of Conduct\n\n- [ ] I confirm that I read and will adhere to the [JOSE code of conduct](https://jose.theoj.org/about#code_of_conduct).\n\n### General checks\n\n- [ ] **Repository:** Is the source for this learning module available at the <a target=\"_blank\" href=\"https://bitbucket.org/cdegroot/cfdcourse/\">repository url</a>?\n- [ ] **License:** Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)\n- [ ] **Version:** Does the release version given match the repository release (v1.0)?\n- [ ] **Authorship:** Has the submitting author (@ctdegroot) made visible contributions to the module? Does the full list of authors seem appropriate and complete?\n\n### Documentation\n\n- [ ] **A statement of need:** Do the authors clearly state the need for this module and who the target audience is?\n- [ ] **Installation instructions:** Is there a clearly stated list of dependencies?\n- [ ] **Usage:** Does the documentation explain how someone would adopt the module, and include examples of how to use it?\n- [ ] **Community guidelines:** Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support\n\n### Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)\n\n- [ ] **Learning objectives:** Does the module make the learning objectives plainly clear? 
(We don't require explicitly written learning objectives; only that they be evident from content and design.)\n- [ ] **Content scope and length:** Is the content substantial for learning a given topic? Is the length of the module appropriate?\n- [ ] **Pedagogy:** Does the module seem easy to follow? Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)\n- [ ] **Content quality:** Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?\n- [ ] **Instructional design:** Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.\n\n### JOSE paper\n\n- [ ] **Authors:** Does the `paper.md` file include a list of authors with their affiliations?\n- [ ] **A statement of need:** Does the paper clearly state the need for this module and who the target audience is?\n- [ ] **Description:** Does the paper describe the learning materials and sequence?\n- [ ] Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?\n- [ ] Could someone else teach with this module, given the right expertise?\n- [ ] Does the paper tell the \"story\" of how the authors came to develop it, or what their expertise is?\n- [ ] **References:** Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?\n\n\n## Review checklist for @zingale\n\n### Conflict of interest\n\n- [ ] As the reviewer I confirm that I have read the [JOSE conflict of interest policy](https://github.com/openjournals/jose/blob/master/COI.md) and that there are no conflicts of interest for me to review this work.\n\n### Code of Conduct\n\n- [ ] I confirm that I read and will adhere to the [JOSE code of conduct](https://jose.theoj.org/about#code_of_conduct).\n\n### General checks\n\n- [ ] **Repository:** Is the source for this learning module available at the <a target=\"_blank\" href=\"https://bitbucket.org/cdegroot/cfdcourse/\">repository url</a>?\n- [ ] **License:** Does the repository contain a plain-text LICENSE file with the contents of a standard license? (OSI-approved for code, Creative Commons for content)\n- [ ] **Version:** Does the release version given match the repository release (v1.0)?\n- [ ] **Authorship:** Has the submitting author (@ctdegroot) made visible contributions to the module? Does the full list of authors seem appropriate and complete?\n\n### Documentation\n\n- [ ] **A statement of need:** Do the authors clearly state the need for this module and who the target audience is?\n- [ ] **Installation instructions:** Is there a clearly stated list of dependencies?\n- [ ] **Usage:** Does the documentation explain how someone would adopt the module, and include examples of how to use it?\n- [ ] **Community guidelines:** Are there clear guidelines for third parties wishing to 1) Contribute to the module 2) Report issues or problems with the module 3) Seek support\n\n### Pedagogy / Instructional design (Work-in-progress: reviewers, please comment!)\n\n- [ ] **Learning objectives:** Does the module make the learning objectives plainly clear? (We don't require explicitly written learning objectives; only that they be evident from content and design.)\n- [ ] **Content scope and length:** Is the content substantial for learning a given topic? Is the length of the module appropriate?\n- [ ] **Pedagogy:** Does the module seem easy to follow? 
Does it observe guidance on cognitive load? (working memory limits of 7 +/- 2 chunks of information)\n- [ ] **Content quality:** Is the writing of good quality, concise, engaging? Are the code components well crafted? Does the module seem complete?\n- [ ] **Instructional design:** Is the instructional design deliberate and apparent? For example, exploit worked-example effects; effective multi-media use; low extraneous cognitive load.\n\n### JOSE paper\n\n- [ ] **Authors:** Does the `paper.md` file include a list of authors with their affiliations?\n- [ ] **A statement of need:** Does the paper clearly state the need for this module and who the target audience is?\n- [ ] **Description:** Does the paper describe the learning materials and sequence?\n- [ ] Does it describe how it has been used in the classroom or other settings, and how someone might adopt it?\n- [ ] Could someone else teach with this module, given the right expertise?\n- [ ] Does the paper tell the \"story\" of how the authors came to develop it, or what their expertise is?\n- [ ] **References:** Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?\n\n\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Hide \"Add\" for stacks that already have instructions","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Vec::with_capacity does not work correctly for zero-sized types","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Missing dependencies in my create-react-app project\n\nwhen I tape `yarn check` in my react.js project it appears this error\r\n```\r\nyarn check v1.16.0\r\ninfo [email protected]: The platform \"linux\" is incompatible with this module.\r\ninfo \"[email protected]\" is an optional dependency and failed compatibility check. Excluding it from installation.\r\ninfo [email protected]: The platform \"linux\" is incompatible with this module.\r\ninfo \"[email protected]\" is an optional dependency and failed compatibility check. Excluding it from installation.\r\nwarning \"react-scripts#babel-jest@^24.8.0\" could be deduped from \"24.8.0\" to \"[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.0.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-loader#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-plugin-named-asset-import#@babel/core@^7.1.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"jest-resolve#jest-pnp-resolver#jest-resolve@*\" could be deduped from \"24.8.0\" to \"[email protected]\"\r\nwarning \"webpack#chrome-trace-event#tslib@^1.9.0\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-class-properties#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-decorators#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-object-rest-spread#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-syntax-dynamic-import#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-classes#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-destructuring#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-flow-strip-types#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-react-display-name#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-runtime#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-async-generator-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-json-strings@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning 
\"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-optional-catch-binding@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-unicode-property-regex@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-syntax-async-generators@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-syntax-optional-catch-binding@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-arrow-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-async-to-generator@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-block-scoped-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-block-scoping@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-computed-properties@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-dotall-regex@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-duplicate-keys@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-exponentiation-operator@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-for-of@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-function-name@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-member-expression-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-amd@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-commonjs@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-systemjs@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-umd@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-named-capturing-groups-regex@^7.4.2\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning 
\"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-new-target@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-object-super@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-parameters@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-property-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-regenerator@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-reserved-words@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-spread@^7.2.0\" could be deduped from \"7.2.2\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-template-literals@^7.2.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-typeof-symbol@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-unicode-regex@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-display-name@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx@^7.0.0\" could be deduped from \"7.3.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-self@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-source@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-jest#babel-preset-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"@typescript-eslint/eslint-plugin#tsutils#tslib@^1.8.1\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"eslint#inquirer#rxjs#tslib@^1.9.0\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-class-properties#@babel/helper-create-class-features-plugin#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-async-generator-functions#@babel/plugin-syntax-async-generators@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-decorators#@babel/plugin-syntax-decorators#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to 
\"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-flow-strip-types#@babel/plugin-syntax-flow#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"@babel/preset-react#@babel/plugin-transform-react-display-name#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nerror \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-self\" not installed\r\nerror \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-source\" not installed\r\nwarning \"babel-preset-react-app#@babel/preset-typescript#@babel/plugin-transform-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-typescript#@babel/plugin-syntax-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"jest-config#babel-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nerror \"babel-jest#babel-preset-jest\" not installed\r\ninfo Found 68 warnings.\r\nerror Found 3 errors.\r\ninfo Visit https://yarnpkg.com/en/docs/cli/check for documentation about this command.\r\n```\r\nI attempted to fix it with `yarn install` but nothing change and show that they all up-to-date\r\n```\r\nyarn install v1.16.0\r\n[1/4] Resolving packages...\r\nsuccess Already up-to-date.\r\nDone in 0.85s.\r\n```\r\nI can't understand what's the reason that causes this and how to fix it.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CRA 3.1.0 - webpack unknown rule \n\nAfter upgrading CRA to 3.1.0, build stop with next exception:\r\n\r\ng:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\plugin-utils.js:29\r\n throw new Error(\r\n ^\r\n\r\nError: Found an unhandled loader in the development webpack config: g:\\source\\simpleui\\src\\node_modules\\resolve-url-loader\\index.js\r\n\r\nThis error probably occurred because you updated react-scripts or craco. Please try updating craco-less to the latest version:\r\n\r\n $ yarn upgrade craco-less\r\n\r\nOr:\r\n\r\n $ npm update craco-less\r\n\r\nIf that doesn't work, craco-less needs to be fixed to support the latest version.\r\nPlease check to see if there's already an issue in the FormAPI/craco-less repo:\r\n\r\n * https://github.com/FormAPI/craco-less/issues?q=is%3Aissue+webpack+unknown+rule\r\n\r\nIf not, please open an issue and we'll take a look. (Or you can send a PR!)\r\n\r\nYou might also want to look for related issues in the craco and create-react-app repos:\r\n\r\n * https://github.com/sharegate/craco/issues?q=is%3Aissue+webpack+unknown+rule\r\n * https://github.com/facebook/create-react-app/issues?q=is%3Aissue+webpack+unknown+rule\r\n\r\n at throwUnexpectedConfigError (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\plugin-utils.js:29:11)\r\n at throwError (g:\\source\\simpleui\\src\\node_modules\\craco-less\\lib\\craco-less.js:14:5)\r\n at loaders.forEach.ruleOrLoader (g:\\source\\simpleui\\src\\node_modules\\craco-less\\lib\\craco-less.js:112:7)\r\n at Array.forEach (<anonymous>)\r\n at Object.overrideWebpackConfig (g:\\source\\simpleui\\src\\node_modules\\craco-less\\lib\\craco-less.js:52:11)\r\n at overrideWebpack (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\features\\plugins.js:40:40)\r\n at cracoConfig.plugins.forEach.x (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\features\\plugins.js:60:29)\r\n at Array.forEach (<anonymous>)\r\n at applyWebpackConfigPlugins (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\features\\plugins.js:59:29)\r\n at overrideWebpack (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\lib\\features\\webpack.js:65:21)\r\n at Object.<anonymous> (g:\\source\\simpleui\\src\\node_modules\\@craco\\craco\\scripts\\start.js:21:1)\r\n at Module._compile (internal/modules/cjs/loader.js:776:30)\r\n at Object.Module._extensions..js (internal/modules/cjs/loader.js:787:10)\r\n at Module.load (internal/modules/cjs/loader.js:653:32)\r\n at tryModuleLoad (internal/modules/cjs/loader.js:593:12)\r\n at Function.Module._load (internal/modules/cjs/loader.js:585:3)\r\nerror Command failed with exit code 1.\r\ninfo Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.\r\nERROR: \"start\" exited with 1.\r\nerror Command failed with exit code 1.\r\ninfo Visit https://yarnpkg.com/en/docs/cli/run for documentation about this command.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [REQUEST] CCE resource to obtain certificates\n\nHi there,\r\n\r\ncurrently I have a shell script which will download the KUBECONFIG for every server but it would be nice to replace this with terraform.\r\n\r\nAccording to the API it is possible to obtain those:\r\nhttps://docs.otc.t-systems.com/en-us/api2/cce/cce_02_0249.html\r\n\r\n### Affected Resource(s)\r\n- opentelekomcloud_cce_certificate\r\n\r\n### Terraform Configuration Files\r\nAs Data, which will cause a problem since the certificiate is not existing till the cce-cluster is created.\r\n```hcl\r\ndata \"opentelekomcloud_cce_certificate\" \"my_cce_certs\" {\r\n project_id = \"\"${var.my_project_id}\"\r\n cluster_id = \"${var.my_cce_cluster_id}\"\r\n}\r\n```\r\nOr as resource, which can then `depends_on` something\r\n```hcl\r\nresource \"opentelekomcloud_cce_certificate\" \"my_cce_certs\" {\r\n project_id = \"\"${var.my_project_id}\"\r\n cluster_id = \"${var.my_cce_cluster_id}\"\r\n\r\n depends_on = [my_cce_cluster_creation]\r\n}\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Misunderstanding about the connection between the HM-11 et Myo","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# I can make this example better: run \"/bin/sh\" \"-c\" \"echo ${title} > /tmp/playing\"\n\n### mpv version and platform\r\ngit master, i. e. 639ee55df7cc1ecf7ea5dcfa7ecc5551b6b7312d\r\n\r\nDOCS/man/input.rst says:\r\n```\r\n ``run \"/bin/sh\" \"-c\" \"echo ${title} > /tmp/playing\"``\r\n\r\n This is not a particularly good example, because it doesn't handle\r\n escaping, and a specially prepared file might allow an attacker to\r\n execute arbitrary shell commands. It is recommended to write a small\r\n shell script, and call that with ``run``.\r\n```\r\nI can fix security problem (if I understand mpv config syntax and mpv rules for expanding `${title}` correctly):\r\n```\r\nrun \"/bin/sh\" \"-c\" \"printf '%s\\\\n' \\\"$1\\\" > /tmp/playing\" dummy-argv0 \"${title}\"\r\n```\r\nPlease, don't trust me! I know how UNIX shell works, so I know that from shell view point code above is correct. But I don't know all details about your mpv syntax, so I could make some mistake. Also, I didn't test code above","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Decide what the version granularity of the docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Query regarding interrupt handling in OPTEE","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# PICO-D4 Core 0 panic'ed (LoadProhibited) (IDFGH-1666)\n\n----------------------------- Delete below -----------------------------\r\n\r\nIf your issue is a general question, starts similar to \"How do I..\", or is related to 3rd party development kits/libs, please discuss this on our community forum at esp32.com instead.\r\n\r\nINSTRUCTIONS\r\n============\r\n\r\nBefore submitting a new issue, please follow the checklist and try to find the answer.\r\n\r\n- [X ] I have read the documentation [ESP-IDF Programming Guide](https://docs.espressif.com/projects/esp-idf/en/latest/) and the issue is not addressed there.\r\n- [X ] I have updated my IDF branch (master or release) to the latest version and checked that the issue is present there.\r\n- [X ] I have searched the issue tracker for a similar issue and not found a similar issue.\r\n\r\nIf the issue cannot be solved after the steps before, please follow these instructions so we can get the needed information to help you in a quick and effective fashion.\r\n\r\n1. Fill in all the fields under **Environment** marked with [ ] by picking the correct option for you in each case and deleting the others.\r\n2. Describe your problem.\r\n3. Include [debug logs on the monitor](https://docs.espressif.com/projects/esp-idf/en/latest/get-started/idf-monitor.html#automatically-decoding-addresses) or the [coredump](https://docs.espressif.com/projects/esp-idf/en/latest/api-guides/core_dump.html).\r\n4. Provide more items under **Other items if possible** can help us better locate your problem.\r\n5. Use markup (buttons above) and the Preview tab to check what the issue will look like.\r\n6. Delete these instructions from the above to the below marker lines before submitting this issue.\r\n\r\n----------------------------- Delete above -----------------------------\r\n\r\n## Environment\r\n\r\n- Development Kit: [none]\r\n- Kit version (for WroverKit/PicoKit/DevKitC): [v1|v2|v3|v4]\r\n- Module or chip used: [|ESP32-PICO-D4]\r\n- IDF version (run ``git describe --tags`` to find it): \r\n // v3.2-dev-1148-g96cd3b75c\r\n- Build System: [Make|CMake]\r\n- Compiler version (run ``xtensa-esp32-elf-gcc --version`` to find it):\r\n // 1.22.0-80-g6c4433a\r\n- Operating System: [Windows]\r\n- Power Supply: [USB|Battery] \r\n\r\n## Problem Description\r\nI have tested the code on an ESP32-WROOM-32 and works fine however the intended target is a PICO-D4. 
I am getting the following error imediatly upon startup.\r\n\r\n`ets Jun 8 2016 00:22:57\r\n\r\nrst:0x9 (RTCWDT_SYS_RESET),boot:0x13 (SPI_FAST_FLASH_BOOT)\r\nconfigsip: 188777542, SPIWP:0xee\r\nclk_drv:0x00,q_drv:0x00,d_drv:0x00,cs0_drv:0x00,hd_drv:0x00,wp_drv:0x00\r\nmode:DIO, clock div:2\r\nload:0x3fff0018,len:4\r\nload:0x3fff001c,len:6248\r\nload:0x40078000,len:10272\r\nho 0 tail 12 room 4\r\nload:0x40080400,len:6608\r\nentry 0x40080764\r\nW (63) boot: PRO CPU has been reset by WDT.\r\nW (63) boot: WDT reset info: PRO CPU PC=0x400621ce\r\nW (63) boot: WDT reset info: APP CPU PC=0xbf43c4ca\r\nI (69) boot: ESP-IDF v3.2.2-42-g4a9f33944 2nd stage bootloader\r\nI (76) boot: compile time 10:38:12\r\nI (80) boot: Enabling RNG early entropy source...\r\nI (85) boot: SPI Speed : 40MHz\r\nI (89) boot: SPI Mode : DIO\r\nI (93) boot: SPI Flash Size : 4MB\r\nI (97) boot: Partition Table:\r\nI (101) boot: ## Label Usage Type ST Offset Length\r\nI (108) boot: 0 nvs WiFi data 01 02 00009000 00004000\r\nI (116) boot: 1 otadata OTA data 01 00 0000d000 00002000\r\nI (123) boot: 2 phy_init RF data 01 01 0000f000 00001000\r\nI (131) boot: 3 factory factory app 00 00 00010000 00140000\r\nI (138) boot: 4 ota_0 OTA app 00 10 00150000 00140000\r\nI (146) boot: 5 ota_1 OTA app 00 11 00290000 00140000\r\nI (154) boot: 6 coredump Unknown data 01 03 003d0000 00010000\r\nI (161) boot: 7 reserved Unknown data 01 fe 003e0000 00020000\r\nI (169) boot: End of partition table\r\nI (173) boot: Defaulting to factory image\r\nI (178) esp_image: segment 0: paddr=0x00010020 vaddr=0x3f400020 size=0x217b0 (137136) map\r\nI (235) esp_image: segment 1: paddr=0x000317d8 vaddr=0x3ffb0000 size=0x02bb0 ( 11184) load\r\nI (239) esp_image: segment 2: paddr=0x00034390 vaddr=0x40080000 size=0x00400 ( 1024) load\r\nI (242) esp_image: segment 3: paddr=0x00034798 vaddr=0x40080400 size=0x0b878 ( 47224) load\r\nI (270) esp_image: segment 4: paddr=0x00040018 vaddr=0x400d0018 size=0x9c368 (639848) map\r\nI (494) esp_image: segment 5: paddr=0x000dc388 vaddr=0x4008bc78 size=0x04f1c ( 20252) load\r\nI (513) boot: Loaded app from partition at offset 0x10000\r\nI (513) boot: Disabling RNG early entropy source...\r\nI (513) cpu_start: Pro cpu up.\r\nI (517) cpu_start: Single core mode\r\nI (521) heap_init: Initializing. RAM available for dynamic allocation:\r\nI (528) heap_init: At 3FFAE6E0 len 00001920 (6 KiB): DRAM\r\nI (534) heap_init: At 3FFB9988 len 00026678 (153 KiB): DRAM\r\nI (541) heap_init: At 3FFE0440 len 0001FBC0 (126 KiB): D/IRAM\r\nI (547) heap_init: At 40078000 len 00008000 (32 KiB): IRAM\r\nI (553) heap_init: At 40090B94 len 0000F46C (61 KiB): IRAM\r\nI (559) cpu_start: Pro cpu start user code\r\nI (242) cpu_start: Starting scheduler on PRO CPU.\r\nGuru Meditation Error: Core 0 panic'ed (LoadProhibited). 
Exception was unhandled.\r\nCore 0 register dump:\r\nPC : 0x400f0bbf PS : 0x00060030 A0 : 0x800d409d A1 : 0x3ffbd800 \r\nA2 : 0x000000c6 A3 : 0x3f41fb10 A4 : 0x00000011 A5 : 0xffffffff \r\nA6 : 0xffffffff A7 : 0x000000c7 A8 : 0x800f0bb1 A9 : 0x3ff49050 \r\nA10 : 0x3ff44570 A11 : 0x000000c6 A12 : 0x0ffd114c A13 : 0x00000000 \r\nA14 : 0x7fffffff A15 : 0x00000000 SAR : 0x00000010 EXCCAUSE: 0x0000001c \r\nEXCVADDR: 0x800d409d LBEG : 0x00000000 LEND : 0x00000000 LCOUNT : 0x00000000 \r\n\r\nBacktrace: 0x400f0bbf:0x3ffbd800 0x400d409a:0x3ffbd830 0x400881f9:0x3ffbd880`\r\n\r\n### Expected Behavior\r\n\r\nAny sugestions would be appreciated as to what I have done wrong or if i need to change a setting for compatibility\r\n\r\n### Actual Behavior\r\n\r\n### Steps to repropduce\r\n\r\n1. step1\r\n2. ...\r\n\r\n// It helps if you attach a picture of your setup/wiring here.\r\n\r\n\r\n### Code to reproduce this issue\r\n\r\nI am using the esp32mesh example with a couple of really minor changes\r\n\r\n```--main.c\r\n/*\r\n * ESPRESSIF MIT License\r\n *\r\n * Copyright (c) 2018 <ESPRESSIF SYSTEMS (SHANGHAI) PTE LTD>\r\n *\r\n * Permission is hereby granted for use on all ESPRESSIF SYSTEMS products, in which case,\r\n * it is free of charge, to any person obtaining a copy of this software and associated\r\n * documentation files (the \"Software\"), to deal in the Software without restriction, including\r\n * without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense,\r\n * and/or sell copies of the Software, and to permit persons to whom the Software is furnished\r\n * to do so, subject to the following conditions:\r\n *\r\n * The above copyright notice and this permission notice shall be included in all copies or\r\n * substantial portions of the Software.\r\n *\r\n * THE SOFTWARE IS PROVIDED \"AS IS\", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR\r\n * IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS\r\n * FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. 
IN NO EVENT SHALL THE AUTHORS OR\r\n * COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER\r\n * IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN\r\n * CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE.\r\n *\r\n */\r\n#include <stdint.h>\r\n#include <stdio.h>\r\n#include \"main.h\"\r\n#include \"mdf_common.h\"\r\n#include \"mwifi.h\"\r\n#include \"duktape.h\"\r\n\r\n#include <string.h>\r\n#include \"freertos/FreeRTOS.h\"\r\n#include \"freertos/task.h\"\r\n#include \"freertos/event_groups.h\"\r\n#include \"esp_system.h\"\r\n#include \"esp_wifi.h\"\r\n#include \"esp_event_loop.h\"\r\n#include \"esp_log.h\"\r\n#include \"nvs_flash.h\"\r\n\r\n#include \"lwip/err.h\"\r\n#include \"lwip/sys.h\"\r\n\r\n// #define MEMORY_DEBUG\r\n\r\n//#include \"lib_display.h\"\r\n#include \"lib_serial.h\"\r\n\r\nstatic int g_sockfd = -1;\r\nstatic const char *TAG = \"main\";\r\n\r\nwifi_sta_list_t wifi_sta_list = {0x0};\r\nmesh_addr_t parent_bssid = {0};\r\n\r\nvoid tcp_client_write_task(void *arg)\r\n{\r\n mdf_err_t ret = MDF_OK;\r\n char *data = MDF_CALLOC(1, MWIFI_PAYLOAD_LEN);\r\n size_t size = MWIFI_PAYLOAD_LEN;\r\n uint8_t src_addr[MWIFI_ADDR_LEN] = {0x0};\r\n mwifi_data_type_t data_type = {0x0};\r\n\r\n MDF_LOGI(\"TCP client write task is running\");\r\n\r\n while (mwifi_is_connected())\r\n {\r\n if (g_sockfd == -1)\r\n {\r\n vTaskDelay(500 / portTICK_RATE_MS);\r\n continue;\r\n }\r\n\r\n size = MWIFI_PAYLOAD_LEN - 1;\r\n memset(data, 0, MWIFI_PAYLOAD_LEN);\r\n ret = mwifi_root_read(src_addr, &data_type, data, &size, portMAX_DELAY);\r\n MDF_ERROR_CONTINUE(ret != MDF_OK, \"<%s> mwifi_root_read\", mdf_err_to_name(ret));\r\n\r\n char *json_data = NULL;\r\n int json_size = asprintf(&json_data, \"{\\\"addr\\\":\\\"\" MACSTR \"\\\",\\\"data\\\":%s}\\n\",\r\n MAC2STR(src_addr), data);\r\n\r\n MDF_LOGI(\"TCP write, size: %d, data: %s\", json_size, json_data);\r\n ret = write(g_sockfd, json_data, json_size);\r\n MDF_FREE(json_data);\r\n MDF_ERROR_CONTINUE(ret <= 0, \"<%s> TCP write\", strerror(errno));\r\n }\r\n\r\n MDF_LOGI(\"TCP client write task is exit\");\r\n\r\n close(g_sockfd);\r\n MDF_FREE(data);\r\n vTaskDelete(NULL);\r\n}\r\n\r\n/**\r\n * @brief Create a tcp client\r\n */\r\nstatic int socket_tcp_client_create(const char *ip, uint16_t port)\r\n{\r\n MDF_PARAM_CHECK(ip);\r\n\r\n MDF_LOGI(\"Create a tcp client, ip: %s, port: %d\", ip, port);\r\n\r\n mdf_err_t ret = ESP_OK;\r\n int sockfd = -1;\r\n struct sockaddr_in server_addr = {\r\n .sin_family = AF_INET,\r\n .sin_port = htons(port),\r\n .sin_addr.s_addr = inet_addr(ip),\r\n };\r\n\r\n sockfd = socket(AF_INET, SOCK_STREAM, 0);\r\n MDF_ERROR_GOTO(sockfd < 0, ERR_EXIT, \"socket create, sockfd: %d\", sockfd);\r\n\r\n ret = connect(sockfd, (struct sockaddr *)&server_addr, sizeof(struct sockaddr_in));\r\n MDF_ERROR_GOTO(ret < 0, ERR_EXIT, \"socket connect, ret: %d, ip: %s, port: %d\",\r\n ret, ip, port);\r\n return sockfd;\r\n\r\nERR_EXIT:\r\n\r\n if (sockfd != -1)\r\n {\r\n close(sockfd);\r\n }\r\n\r\n return -1;\r\n}\r\n\r\nvoid tcp_client_read_task(void *arg)\r\n{\r\n mdf_err_t ret = MDF_OK;\r\n char *data = MDF_MALLOC(MWIFI_PAYLOAD_LEN);\r\n size_t size = MWIFI_PAYLOAD_LEN;\r\n uint8_t dest_addr[MWIFI_ADDR_LEN] = {0x0};\r\n mwifi_data_type_t data_type = {0x0};\r\n\r\n MDF_LOGI(\"TCP client read task is running\");\r\n\r\n while (mwifi_is_connected())\r\n {\r\n if (g_sockfd == -1)\r\n {\r\n g_sockfd = socket_tcp_client_create(CONFIG_SERVER_IP, CONFIG_SERVER_PORT);\r\n\r\n if 
(g_sockfd == -1)\r\n {\r\n vTaskDelay(500 / portTICK_RATE_MS);\r\n continue;\r\n }\r\n }\r\n\r\n memset(data, 0, MWIFI_PAYLOAD_LEN);\r\n ret = read(g_sockfd, data, size);\r\n MDF_LOGI(\"TCP read, %d, size: %d, data: %s\", g_sockfd, size, data);\r\n\r\n if (ret <= 0)\r\n {\r\n MDF_LOGW(\"<%s> TCP read\", strerror(errno));\r\n close(g_sockfd);\r\n g_sockfd = -1;\r\n continue;\r\n }\r\n\r\n cJSON *pJson = NULL;\r\n cJSON *pSub = NULL;\r\n\r\n pJson = cJSON_Parse(data);\r\n MDF_ERROR_CONTINUE(!pJson, \"cJSON_Parse, data format error\");\r\n\r\n pSub = cJSON_GetObjectItem(pJson, \"addr\");\r\n\r\n if (!pSub)\r\n {\r\n MDF_LOGW(\"cJSON_GetObjectItem, Destination address not set\");\r\n cJSON_Delete(pJson);\r\n continue;\r\n }\r\n\r\n /**\r\n * @brief Convert mac from string format to binary\r\n */\r\n do\r\n {\r\n uint32_t mac_data[MWIFI_ADDR_LEN] = {0};\r\n sscanf(pSub->valuestring, MACSTR,\r\n mac_data, mac_data + 1, mac_data + 2,\r\n mac_data + 3, mac_data + 4, mac_data + 5);\r\n\r\n for (int i = 0; i < MWIFI_ADDR_LEN; i++)\r\n {\r\n dest_addr[i] = mac_data[i];\r\n }\r\n } while (0);\r\n\r\n pSub = cJSON_GetObjectItem(pJson, \"data\");\r\n\r\n if (!pSub)\r\n {\r\n MDF_LOGW(\"cJSON_GetObjectItem, Failed to get data\");\r\n cJSON_Delete(pJson);\r\n continue;\r\n }\r\n\r\n char *json_data = cJSON_PrintUnformatted(pSub);\r\n\r\n //trailing newline\r\n\r\n ret = mwifi_root_write(dest_addr, 1, &data_type, json_data, strlen(json_data), true);\r\n MDF_ERROR_CONTINUE(ret != MDF_OK, \"<%s> mwifi_root_write\", mdf_err_to_name(ret));\r\n\r\n MDF_FREE(json_data);\r\n cJSON_Delete(pJson);\r\n }\r\n\r\n MDF_LOGI(\"TCP client read task is exit\");\r\n\r\n close(g_sockfd);\r\n g_sockfd = -1;\r\n MDF_FREE(data);\r\n vTaskDelete(NULL);\r\n}\r\n\r\nstatic void node_read_task(void *arg)\r\n{\r\n mdf_err_t ret = MDF_OK;\r\n cJSON *pJson = NULL;\r\n cJSON *pSub = NULL;\r\n char *data = MDF_MALLOC(MWIFI_PAYLOAD_LEN);\r\n size_t size = MWIFI_PAYLOAD_LEN;\r\n mwifi_data_type_t data_type = {0x0};\r\n uint8_t src_addr[MWIFI_ADDR_LEN] = {0x0};\r\n\r\n MDF_LOGI(\"Note read task is running\");\r\n\r\n for (;;)\r\n {\r\n if (!mwifi_is_connected())\r\n {\r\n vTaskDelay(500 / portTICK_RATE_MS);\r\n continue;\r\n }\r\n\r\n size = MWIFI_PAYLOAD_LEN;\r\n memset(data, 0, MWIFI_PAYLOAD_LEN);\r\n ret = mwifi_read(src_addr, &data_type, data, &size, portMAX_DELAY);\r\n MDF_ERROR_CONTINUE(ret != MDF_OK, \"<%s> mwifi_read\", mdf_err_to_name(ret));\r\n MDF_LOGD(\"Node receive: \" MACSTR \", size: %d, data: %s\", MAC2STR(src_addr), size, data);\r\n\r\n pJson = cJSON_Parse(data);\r\n MDF_ERROR_CONTINUE(!pJson, \"cJSON_Parse, data format error, data: %s\", data);\r\n\r\n pSub = cJSON_GetObjectItem(pJson, \"status\");\r\n\r\n if (!pSub)\r\n {\r\n MDF_LOGW(\"cJSON_GetObjectItem, Destination address not set\");\r\n cJSON_Delete(pJson);\r\n continue;\r\n }\r\n\r\n gpio_set_level(CONFIG_LED_GPIO_NUM, pSub->valueint);\r\n\r\n cJSON_Delete(pJson);\r\n }\r\n\r\n MDF_LOGW(\"Note read task is exit\");\r\n\r\n MDF_FREE(data);\r\n vTaskDelete(NULL);\r\n}\r\n\r\nstatic void node_write_task(void *arg)\r\n{\r\n mdf_err_t ret = MDF_OK;\r\n int count = 0;\r\n size_t size = 0;\r\n char *data = NULL;\r\n mwifi_data_type_t data_type = {0x0};\r\n\r\n //mesh_addr_t parent_bssid = {0};\r\n\r\n MDF_LOGI(\"NODE task is running\");\r\n\r\n for (;;)\r\n {\r\n if (!mwifi_is_connected())\r\n {\r\n vTaskDelay(500 / portTICK_RATE_MS);\r\n continue;\r\n }\r\n\r\n //mesh rssi\r\n mesh_assoc_t mesh_assoc = {0x0};\r\n esp_wifi_vnd_mesh_get(&mesh_assoc);\r\n\r\n //parent 
ssid\r\n esp_mesh_get_parent_bssid(&parent_bssid);\r\n uint8_t sta_mac[MWIFI_ADDR_LEN] = {0};\r\n memcpy(sta_mac, parent_bssid.addr, MWIFI_ADDR_LEN);\r\n\r\n char str_mac[20];\r\n sprintf(str_mac, \"%02x:%02x:%02x:%02x:%02x:%02x\", sta_mac[0], sta_mac[1], sta_mac[2], sta_mac[3], sta_mac[4], sta_mac[5]);\r\n\r\n /// child macs\r\n // wifi_sta_list_t wifi_sta_list = {0x0};\r\n // esp_wifi_ap_get_sta_list(&wifi_sta_list);\r\n\r\n char *str_children;\r\n //sprintf(str_children, \"%d\", wifi_sta_list.num);\r\n\r\n // //char str_children[20] = \"[]\";\r\n\r\n cJSON *children;\r\n children = cJSON_CreateArray();\r\n\r\n for (int i = 0; i < wifi_sta_list.num; i++)\r\n {\r\n // //MDF_LOGI(\"Child mac: \" MACSTR, MAC2STR(wifi_sta_list.sta[i].mac));\r\n char str_child_mac[20];\r\n sprintf(str_child_mac, \"%02x:%02x:%02x:%02x:%02x:%02x\", wifi_sta_list.sta[i].mac[0], wifi_sta_list.sta[i].mac[1], wifi_sta_list.sta[i].mac[2], wifi_sta_list.sta[i].mac[3], wifi_sta_list.sta[i].mac[4], wifi_sta_list.sta[i].mac[5]);\r\n cJSON *arrayItem = cJSON_CreateString(str_child_mac);\r\n cJSON_AddItemToArray(children, arrayItem);\r\n }\r\n\r\n str_children = cJSON_Print(children);\r\n ///\r\n\r\n size = asprintf(&data, \"{\\\"seq\\\":%d,\\\"layer\\\":%d,\\\"status\\\":%d,\\\"version\\\":\\\"%s\\\",\\\"nodenum\\\":%d,\\\"parent\\\":\\\"%s\\\", \\\"rssi\\\": %d ,\\\"children\\\": %s }\",\r\n count++, esp_mesh_get_layer(), gpio_get_level(CONFIG_LED_GPIO_NUM), FIRMWARE_VERSION, esp_mesh_get_total_node_num(), str_mac, mesh_assoc.rssi, str_children);\r\n\r\n MDF_LOGD(\"Node send, size: %d, data: %s\", size, data);\r\n ret = mwifi_write(NULL, &data_type, data, size, true);\r\n MDF_FREE(data);\r\n MDF_ERROR_CONTINUE(ret != MDF_OK, \"<%s> mwifi_write\", mdf_err_to_name(ret));\r\n\r\n vTaskDelay(5000 / portTICK_RATE_MS);\r\n }\r\n\r\n MDF_LOGW(\"NODE task is exit\");\r\n\r\n vTaskDelete(NULL);\r\n}\r\n\r\nvoid node_mesh_write(char *dataraw, int len)\r\n{\r\n ESP_LOGI(TAG, \"RECV LEN: %d SERIAL:%s\", len, dataraw);\r\n //ESP_LOGI(TAG, \"NODEMESHTCPWRITE: %s\", dataraw);\r\n //ESP_LOG_BUFFER_HEX_LEVEL(TAG, dataraw, len, 0);\r\n\r\n // //cleanup incoming data\r\n // // char datatcp[len];\r\n\r\n // // for (int a = 0; a < len; a++ ) {\r\n // // datatcp[a] = dataraw[a];\r\n // // }\r\n // // datatcp[len-1] = '\\0';\r\n\r\n // // //memset(datatcp, '\\0', sizeof(datatcp));\r\n // // //strcpy(datatcp, dataraw);\r\n\r\n size_t size = 0;\r\n char *data = NULL;\r\n\r\n dataraw[strcspn(dataraw, \"\\n\")] = 0;\r\n dataraw[strcspn(dataraw, \"\\r\")] = 0;\r\n\r\n size = asprintf(&data, \"{\\\"tcp\\\": \\\"%s\\\", \\\"len\\\" : %d}\", dataraw, len);\r\n mwifi_data_type_t data_type = {0x0};\r\n\r\n MDF_LOGD(\"TCP mesh chars: %d send: %s\", len, data);\r\n\r\n if (mwifi_is_connected())\r\n {\r\n mwifi_write(NULL, &data_type, data, size, true);\r\n }\r\n MDF_FREE(data);\r\n}\r\n\r\n/**\r\n * @brief Timed printing system information\r\n */\r\nstatic void print_system_info_timercb(void *timer)\r\n{\r\n uint8_t primary = 0;\r\n wifi_second_chan_t second = 0;\r\n\r\n uint8_t sta_mac[MWIFI_ADDR_LEN] = {0};\r\n mesh_assoc_t mesh_assoc = {0x0};\r\n\r\n esp_wifi_get_mac(ESP_IF_WIFI_STA, sta_mac);\r\n esp_wifi_ap_get_sta_list(&wifi_sta_list);\r\n esp_wifi_get_channel(&primary, &second);\r\n esp_wifi_vnd_mesh_get(&mesh_assoc);\r\n esp_mesh_get_parent_bssid(&parent_bssid);\r\n\r\n // MDF_LOGI(\"System information, channel: %d, layer: %d, self mac: \" MACSTR \", parent bssid: \" MACSTR\r\n // \", parent rssi: %d, node num: %d, free heap: %u\",\r\n // 
primary,\r\n // esp_mesh_get_layer(), MAC2STR(sta_mac), MAC2STR(parent_bssid.addr),\r\n // mesh_assoc.rssi, esp_mesh_get_total_node_num(), esp_get_free_heap_size());\r\n\r\n// lib_display_setNodeNum(esp_mesh_get_total_node_num());\r\n// lib_display_setLayer(esp_mesh_get_layer());\r\n// lib_display_setRSSI(mesh_assoc.rssi);\r\n// lib_display_setMac(sta_mac);\r\n\r\n for (int i = 0; i < wifi_sta_list.num; i++)\r\n {\r\n //MDF_LOGI(\"Child mac: \" MACSTR, MAC2STR(wifi_sta_list.sta[i].mac));\r\n }\r\n\r\n#ifdef MEMORY_DEBUG\r\n if (!heap_caps_check_integrity_all(true))\r\n {\r\n MDF_LOGE(\"At least one heap is corrupt\");\r\n }\r\n\r\n mdf_mem_print_heap();\r\n mdf_mem_print_record();\r\n#endif /**< MEMORY_DEBUG */\r\n}\r\n\r\nstatic EventGroupHandle_t s_wifi_event_group;\r\n\r\n// static esp_err_t event_handler(void *ctx, system_event_t *event)\r\n// {\r\n// switch (event->event_id)\r\n// {\r\n// case SYSTEM_EVENT_AP_STACONNECTED:\r\n// ESP_LOGI(TAG, \"station:\" MACSTR \" join, AID=%d\",\r\n// MAC2STR(event->event_info.sta_connected.mac),\r\n// event->event_info.sta_connected.aid);\r\n// break;\r\n// case SYSTEM_EVENT_AP_STADISCONNECTED:\r\n// ESP_LOGI(TAG, \"station:\" MACSTR \"leave, AID=%d\",\r\n// MAC2STR(event->event_info.sta_disconnected.mac),\r\n// event->event_info.sta_disconnected.aid);\r\n// break;\r\n// default:\r\n// break;\r\n// }\r\n// return ESP_OK;\r\n// }\r\n\r\nstatic mdf_err_t wifi_init()\r\n{\r\n mdf_err_t ret = nvs_flash_init();\r\n wifi_init_config_t cfg = WIFI_INIT_CONFIG_DEFAULT();\r\n\r\n if (ret == ESP_ERR_NVS_NO_FREE_PAGES || ret == ESP_ERR_NVS_NEW_VERSION_FOUND)\r\n {\r\n MDF_ERROR_ASSERT(nvs_flash_erase());\r\n ret = nvs_flash_init();\r\n }\r\n\r\n MDF_ERROR_ASSERT(ret);\r\n\r\n tcpip_adapter_init();\r\n MDF_ERROR_ASSERT(esp_event_loop_init(NULL, NULL));\r\n MDF_ERROR_ASSERT(esp_wifi_init(&cfg));\r\n MDF_ERROR_ASSERT(esp_wifi_set_storage(WIFI_STORAGE_FLASH));\r\n MDF_ERROR_ASSERT(esp_wifi_set_mode(WIFI_MODE_AP));\r\n MDF_ERROR_ASSERT(esp_wifi_set_ps(WIFI_PS_NONE));\r\n MDF_ERROR_ASSERT(esp_mesh_set_6m_rate(false));\r\n\r\n s_wifi_event_group = xEventGroupCreate();\r\n\r\n // ESP_ERROR_CHECK(esp_event_loop_init(event_handler, NULL));\r\n\r\n ESP_ERROR_CHECK(esp_wifi_init(&cfg));\r\n wifi_config_t wifi_config = {\r\n .ap = {\r\n .ssid = CONFIG_ROUTER_SSID,\r\n .ssid_len = strlen(CONFIG_ROUTER_SSID),\r\n .password = CONFIG_ROUTER_PASSWORD,\r\n .max_connection = 10,\r\n .authmode = WIFI_AUTH_WPA_WPA2_PSK},\r\n };\r\n if (strlen(CONFIG_ROUTER_PASSWORD) == 0)\r\n {\r\n wifi_config.ap.authmode = WIFI_AUTH_OPEN;\r\n }\r\n\r\n ESP_ERROR_CHECK(esp_wifi_set_mode(WIFI_MODE_AP));\r\n ESP_ERROR_CHECK(esp_wifi_set_config(ESP_IF_WIFI_AP, &wifi_config));\r\n // ESP_ERROR_CHECK(esp_wifi_start());\r\n MDF_ERROR_ASSERT(esp_wifi_start());\r\n\r\n ESP_LOGI(TAG, \"wifi_init_softap finished.SSID:%s password:%s\",\r\n CONFIG_ROUTER_SSID, CONFIG_ROUTER_PASSWORD);\r\n\r\n return MDF_OK;\r\n}\r\n\r\n/**\r\n * @brief All module events will be sent to this task in esp-mdf\r\n *\r\n * @Note:\r\n * 1. Do not block or lengthy operations in the callback function.\r\n * 2. 
Do not consume a lot of memory in the callback function.\r\n * The task memory of the callback function is only 4KB.\r\n */\r\nstatic mdf_err_t event_loop_cb(mdf_event_loop_t event, void *ctx)\r\n{\r\n MDF_LOGI(\"event_loop_cb, event: %d\", event);\r\n\r\n switch (event)\r\n {\r\n case MDF_EVENT_MWIFI_STARTED:\r\n MDF_LOGI(\"MESH is started\");\r\n break;\r\n\r\n case MDF_EVENT_MWIFI_PARENT_CONNECTED:\r\n MDF_LOGI(\"Parent is connected on station interface\");\r\n break;\r\n\r\n case MDF_EVENT_MWIFI_PARENT_DISCONNECTED:\r\n MDF_LOGI(\"Parent is disconnected on station interface\");\r\n break;\r\n\r\n case MDF_EVENT_MWIFI_ROUTING_TABLE_ADD:\r\n case MDF_EVENT_MWIFI_ROUTING_TABLE_REMOVE:\r\n MDF_LOGI(\"total_num: %d\", esp_mesh_get_total_node_num());\r\n break;\r\n\r\n case MDF_EVENT_MWIFI_ROOT_GOT_IP:\r\n {\r\n MDF_LOGI(\"Root obtains the IP address. It is posted by LwIP stack automatically\");\r\n xTaskCreate(tcp_client_write_task, \"tcp_client_write_task\", 4 * 1024,\r\n NULL, CONFIG_MDF_TASK_DEFAULT_PRIOTY, NULL);\r\n xTaskCreate(tcp_client_read_task, \"tcp_server_read\", 4 * 1024,\r\n NULL, CONFIG_MDF_TASK_DEFAULT_PRIOTY, NULL);\r\n break;\r\n }\r\n\r\n default:\r\n break;\r\n }\r\n\r\n return MDF_OK;\r\n}\r\n\r\nstatic duk_ret_t native_print(duk_context *ctx)\r\n{\r\n duk_push_string(ctx, \" \");\r\n duk_insert(ctx, 0);\r\n duk_join(ctx, duk_get_top(ctx) - 1);\r\n printf(\"%s\\n\", duk_safe_to_string(ctx, -1));\r\n return 0;\r\n}\r\n\r\nstatic duk_ret_t native_adder(duk_context *ctx)\r\n{\r\n int i;\r\n int n = duk_get_top(ctx); /* #args */\r\n double res = 0.0;\r\n\r\n for (i = 0; i < n; i++)\r\n {\r\n res += duk_to_number(ctx, i);\r\n }\r\n\r\n duk_push_number(ctx, res);\r\n return 1; /* one return value */\r\n}\r\n\r\nvoid app_main()\r\n{\r\n\r\n // javascript:\r\n // int test = 0;\r\n\r\n // while (test == 0)\r\n // {\r\n // duk_context *ctx = duk_create_heap_default();\r\n\r\n // duk_push_c_function(ctx, native_print, DUK_VARARGS);\r\n // duk_put_global_string(ctx, \"print\");\r\n // duk_push_c_function(ctx, native_adder, DUK_VARARGS);\r\n // duk_put_global_string(ctx, \"adder\");\r\n\r\n // duk_eval_string(ctx, \"print('Hello world!');\");\r\n\r\n // duk_eval_string(ctx, \"print('2+3=' + adder(2, 3));\");\r\n // duk_pop(ctx); /* pop eval result */\r\n\r\n // duk_destroy_heap(ctx);\r\n\r\n // test = 1;\r\n // }\r\n\r\n //xTaskCreate(task_test_SSD1306i2c, \"task_test_SSD1306i2c\", 4 * 1024, NULL, CONFIG_MDF_TASK_DEFAULT_PRIOTY, NULL);\r\n\r\n xTaskCreate(serial_port_task, \"serial_port_task\", 4096, NULL, 5, NULL);\r\n\r\n mwifi_init_config_t cfg = MWIFI_INIT_CONFIG_DEFAULT();\r\n mwifi_config_t config = {\r\n .router_ssid = CONFIG_ROUTER_SSID,\r\n .router_password = CONFIG_ROUTER_PASSWORD,\r\n .mesh_id = CONFIG_MESH_ID,\r\n .mesh_password = CONFIG_MESH_PASSWORD,\r\n };\r\n\r\n /**\r\n * @brief Set the log level for serial port printing.\r\n */\r\n esp_log_level_set(\"*\", ESP_LOG_INFO);\r\n esp_log_level_set(TAG, ESP_LOG_DEBUG);\r\n\r\n gpio_pad_select_gpio(CONFIG_LED_GPIO_NUM);\r\n gpio_set_direction(CONFIG_LED_GPIO_NUM, GPIO_MODE_INPUT_OUTPUT);\r\n\r\n /**\r\n * @brief Initialize wifi mesh.\r\n */\r\n MDF_ERROR_ASSERT(mdf_event_loop_init(event_loop_cb));\r\n MDF_ERROR_ASSERT(wifi_init());\r\n MDF_ERROR_ASSERT(mwifi_init(&cfg));\r\n MDF_ERROR_ASSERT(mwifi_set_config(&config));\r\n MDF_ERROR_ASSERT(mwifi_start());\r\n\r\n /**\r\n * @breif Create handler\r\n */\r\n xTaskCreate(node_write_task, \"node_write_task\", 4 * 1024,\r\n NULL, CONFIG_MDF_TASK_DEFAULT_PRIOTY, NULL);\r\n 
xTaskCreate(node_read_task, \"node_read_task\", 4 * 1024,\r\n NULL, CONFIG_MDF_TASK_DEFAULT_PRIOTY, NULL);\r\n\r\n TimerHandle_t timer = xTimerCreate(\"print_system_info\", 1000 / portTICK_RATE_MS,\r\n true, NULL, print_system_info_timercb);\r\n xTimerStart(timer, 0);\r\n\r\n ESP_LOGI(TAG, \"ESP_WIFI_MODE_AP\");\r\n}\r\n\r\n//---main.h\r\n#ifndef __MESH_MAIN__\r\n#define __MESH_MAIN__\r\n\r\n#define FIRMWARE_VERSION \"0.0.12\"\r\n\r\nvoid node_mesh_write(char *data, int len);\r\n\r\n#endif\r\n\r\n\r\n```\r\n// If your code is longer than 30 lines, [GIST](https://gist.github.com) is preferred.\r\n\r\n## Debug Logs\r\n\r\n```\r\nDebug log goes here, should contain the backtrace, as well as the reset source if it is a crash.\r\nPlease copy the plain text here for us to search the error log. Or attach the complete logs but leave the main part here if the log is *too* long.\r\n```\r\n\r\n## Other items if possible\r\n\r\n- [ ] sdkconfig file (attach the sdkconfig file from your project folder)\r\n- [ ] elf file in the ``build`` folder (**note this may contain all the code details and symbols of your project.**)\r\n- [ ] coredump (This provides stacks of tasks.) \r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Are you going to add this to the docs? \n\nWould be nice to have complete instructions and it seems like there is something missing to get Action Buttons working.\r\n\r\n\r\nhttps://github.com/Iterable/swift-sdk/blob/33d943c450bcc1d6530edae6091f44a3db505a59/notification-extension/ITBNotificationServiceExtension.swift#L20","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Rust: Awesome Rust\n\nhttps://github.com/ctjhoa/rust-learning/blob/master/README.md","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation needs update","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Round out upload functions","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"?modelspec","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# add readme with usage details\n\nLets setup a README file and include needed details for running dev setup, building a release version, and anything setup and configuration related","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update Jobs Docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"DS should force the path to the ceph.conf, keyring and CephX username of openATTIC","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"new read.rdf that returns data frame instead of list","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"aio: fix links to source when path contains symlinks","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# SDK blockchain simulator docs\n\n<!-- < < < < < < < < < < < < < < < < < < < < < < < < < < < < < < < < < \u263a \r\nv \u2730 Thanks for opening an issue! \u2730 \r\nv Before smashing the submit button please review the template.\r\nv Word of caution: poorly thought-out proposals may be rejected \r\nv without deliberation \r\n\u263a > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > > -->\r\n\r\n## Summary\r\n\r\n<!-- Short, concise description of the proposed feature -->\r\nAdd and update the simulation docs with the new `SimulationManager`, `x/<module>/simulation` structure, tests usage, integration, etc:\r\n\r\n- [ ] Spec for `x/simulation`\r\n- [ ] `SimApp` usage docs\r\n- [ ] Godocs\r\n\r\n____\r\n\r\n#### For Admin Use\r\n\r\n- [ ] Not duplicate issue\r\n- [ ] Appropriate labels applied\r\n- [ ] Appropriate contributors tagged\r\n- [ ] Contributor assigned/self-assigned\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Buildpack failing - unexpected tLABEL error","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"When custom domain used for azure storage - SDK breaks signature","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Please update README.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"making multipart/form-data requests","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Post widget in different directory\n\nHi. \r\n\r\nI'm currently trying to use the post widget for a different menu directory. I currently have the `content/home/post.md` file, which looks at recent I've created. Now I've created a menu with the following dropdown selection.\r\n\r\n```\r\n[[main]]\r\n name = \"Home\"\r\n url = \"#about\"\r\n weight = 10\r\n\r\n[[main]]\r\n name = \"Posts\"\r\n url = \"#posts\"\r\n identifier = \"dropdown\"\r\n weight = 20\r\n hasChildren = true\r\n\r\n [[main]]\r\n name = \"Finance\"\r\n url = \"/Finance\"\r\n weight = 20\r\n parent = \"dropdown\"\r\n\r\n [[main]]\r\n name = \"Economics\"\r\n url = \"/Economics/\"\r\n weight = 20\r\n parent = \"dropdown\"\r\n\r\n [[main]]\r\n name = \"Data Science\"\r\n url = \"/Data Science/\"\r\n weight = 20\r\n parent = \"dropdown\"\r\n\r\n [[main]]\r\n name = \"Machine Learning\"\r\n url = \"/Machine Learning\"\r\n weight = 20\r\n parent = \"dropdown\"\r\n\r\n [[main]]\r\n name = \"Programming\"\r\n url = \"/Programming/\"\r\n weight = 20\r\n parent = \"dropdown\"\r\n```\r\n\r\nI've created a new post widget called finance.md, which my dropdown menu links to. I was hoping to have the finance.md show only posts in the finance category by changing the filter option in the widget, but that was unsuccessful. I was wondering what I was missing, what I'm doing wrong, or if I'm on the right track. I've looked at the documentation, but I couldn't really find anything related to what I'm trying to do.\r\n\r\nThanks in advance.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add Using with Stylelint section","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Issue SV-Zanshin","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Frame is always displayed, also when in other panel\n\n# Frame is always displayed, also when a different tab is active\r\n\r\n## Expected\r\nCompodoc should only display when compodoc tab is active\r\n\r\n## Actual\r\nFrame is always displayed, eg when knobs tab is active\r\n\r\n## Screenshots\r\n\r\nNote the active tab!\r\n\r\n\r\n\r\nNote the knobs underneath!\r\n\r\n\r\n\r\n## Proposed solution\r\nUse active prop - PR: https://github.com/wgrabowski/storybook-addon-compodoc/pull/4\r\n### Solution taken from:\r\nhttps://github.com/storybookjs/storybook/blob/master/addons/knobs/src/components/Panel.js#L162\r\nhttps://github.com/storybookjs/storybook/blob/master/addons/knobs/src/register.js#L10\r\n### More\r\nhttps://storybook.js.org/docs/addons/writing-addons/ `return active ? <div /> : null;`","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Razorpay: Use a built release version of the razorpay-php library instead of the source","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# `boot.crashDump.enable = true` severely overheats CPU\n\n**Describe the bug**\r\nOn NixOS generations with boot.crashDump.enable = true, my CPU temperature is consistently > 99\u00b0 C (and fans at 100%) while my \"CPU usage\" (according to polybar) is <10%.\r\n\r\nI'd provide more debugging information but I'm currently too afraid of damaging my hardware to explore without a plan in mind (CPU temps were peaking at 105\u00b0 C).\r\n\r\n**To Reproduce** \r\n(I am using a Macbook Pro 12,1)\r\n1. add `boot.crashDump.enable = true` to `configuration.nix`\r\n2. `sudo nixos-rebuild boot`\r\n3. Boot to new generation\r\n4. CPU is severely overheating\r\n5. Boot to previous generation (the one w/out crashDump)\r\n6. Everything is working normally\r\n\r\n**Expected behavior**\r\nI expected a NixOS generation with `crashDump` installed to only slightly slow down my system, without causing overheating.\r\n\r\n**Additional context**\r\nI am trying to use `crashDump` in an attempt to debug #61851.\r\n\r\n**Metadata**\r\n```\r\n - system: `\"x86_64-linux\"`\r\n - host os: `Linux 5.1.19, NixOS, 19.03.173172.a607a931f6f (Koi)`\r\n - multi-user?: `yes`\r\n - sandbox: `yes`\r\n - version: `nix-env (Nix) 2.2.2`\r\n - channels(ajanse): `\"unstable-19.09pre185259.362be9608c3\"`\r\n - channels(root): `\"nixos-19.03.173172.a607a931f6f\"`\r\n - nixpkgs: `/nix/var/nix/profiles/per-user/root/channels/nixos`\r\n```\r\n\r\nMaintainer information:\r\n```yaml\r\n# a list of nixos modules affected by the problem\r\nmodule:\r\n- boot.crashDump\r\n```\r\n\r\n(I'm not sure if I populated the above code block correctly)\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Typing Japanese with Windows IME on IE11/Edge does not fire change events","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# TypeScript type error for changefreq property\n\nThere seems to be an issue with the enums for `changefreq` when using TypeScript. Using an example from the documentation:\r\n\r\n```typescript\r\nvar sitemap = smGenerator.createSitemap({\r\n urls: [\r\n {\r\n url: 'http://mobile.test.com/page-1/',\r\n changefreq: 'weekly',\r\n priority: 0.3,\r\n mobile: true,\r\n },\r\n ],\r\n xslUrl: 'sitemap.xsl',\r\n });\r\n```\r\n\r\nI get the following error:\r\n\r\n```\r\nType '{ url: string; changefreq: string; priority: number; mobile: boolean; }' is not assignable to type 'string | SitemapItemOptions'.\r\n Type '{ url: string; changefreq: string; priority: number; mobile: boolean; }' is not assignable to type 'SitemapItemOptions'.\r\n Types of property 'changefreq' are incompatible.\r\n Type 'string' is not assignable to type 'EnumChangefreq'.ts(2322)\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Two masters using one etcd(High Availability of Masters)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Using SITL for ArduPilot Testing","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CPT Item Not Found on User Dashboard\n\nHello,\r\n\r\n- I have configured the form correctly so that it creates a CPT item\r\n- the CPT item is clearly visible in the back end and on the site\r\n\r\nProblem = on the front end user's account this indicates \"No items found. \" . So it is currently impossible for the user to see his publication and thus make changes or cancel \r\n\r\nThank you for your future answer: -)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Documentation says to use credentials.py, but db string is taken from ~/replica.my.cnf\n\nThe documentation in the README says that credentials.py should be updated. This database connection string was used in previous versions of the lucky bot that used SQLAlchemy. However, with the change to use pymysql directly, the DB string in credentials.py is no longer explicitly used.\r\n\r\nThis will cause breakage with Dockerization.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Typos and such in README.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update build-from-source instructions","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Root domain missing styles\n\n*Reproduce*\r\n1. Visit root docs doman w/ a browser: [https://viewdocs.com](https://viewdocs.com)\r\n2. CSS resource 404s: https://static.gist.io/css/screen.css 404\r\n\r\n\r\nThe screen.css suggests `static.gist.io` is a ZEiT now app","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Test-PlasterManifest 'Error: The 'powerShellVersion' attribute is not declared.'","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Plugin tries to clean project dir","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Job is run immediately if app is paused before scheduling","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Webpack build file missing, front-end assets cannot be loaded Error\n\nInstructions:\r\n\r\nHello, I followed the instruction in Kolibri developer documentation.\r\nAnd after run server with the command \"yarn run devserver\" with Pipenv set, I got this error below. I tried several times set env and unset, but couldn't find a way to fix. Did I miss to type some Webpack bundle command?\r\n\r\n\r\nserver side\r\n```\r\nkolibri.core.webpack.hooks.WebpackError: Webpack build file missing, front-end assets cannot be loaded. Problems loading: /Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python3.7/site-packages/kolibri_exercise_perseus_plugin/build/_stats.json\r\nERROR \"GET /en/setup_wizard/ HTTP/1.1\" 500 227678\r\n```\r\n\r\nclient side\r\n\r\n```\r\n\r\nRequest Method: | GET\r\n-- | --\r\nhttp://localhost:8000/en/setup_wizard/\r\n1.11.22\r\nWebpackError\r\nWebpack build file missing, front-end assets cannot be loaded. Problems loading: /Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python3.7/site-packages/kolibri_exercise_perseus_plugin/build/_stats.json\r\n/Users/hyunahn/github/kolibri/kolibri/core/webpack/hooks.py in _stats_file_content, line 139\r\n/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/bin/python3\r\n3.7.3\r\n['/Users/hyunahn/github/kolibri/kolibri/dist', '/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/bin', '/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python37.zip', '/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python3.7', '/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python3.7/lib-dynload', '/Library/Frameworks/Python.framework/Versions/3.7/lib/python3.7', '/Users/hyunahn/.local/share/virtualenvs/kolibri-nE0xMgB-/lib/python3.7/site-packages', '/Users/hyunahn/github/kolibri']\r\nSun, 11 Aug 2019 16:46:33 +0900\r\n\r\n\r\n````\r\n\r\n\r\nTell us about your environment, including:\r\n * Kolibri version: latest dev\r\n * Operating system: Mac os 10.14.6\r\n * Browser: Chrome\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Locally reference patches in Makefile aren't applied","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Update schema-interpreter to SchemaBuilder\n\nWe used this tool early on to generate schemas to start the documentation. At this point the code itself is useless as the schemas have organically changed over the last year.\r\n\r\nWe should refactor/make a new script that can interpret schemas and supply a light API to add fields conditionally, wholly or otherwise.\r\n\r\nIt is known that we are going to be adding the \"Introduced in\" field to all fields in the near future and this will assist that and any future issues.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How do I swap between modules during runtime?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Installation error","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"KeyError: \"Unable to open object (Object 'dense_2' doesn't exist)\"","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# scala-steward can't run on sbt projects that depends on other sbt project that is downloaded by git submodule\n\nIf the sbt project depends on the other sbt project that is downloaded using git submodule like this:\r\n\r\n```\r\n[submodule \"submodule\"]\r\n path = submodule\r\n url = [email protected]:test/submodule.git\r\n # Assuming that test/submodule is also a sbt project.\r\n```\r\n\r\n```\r\nlazy val submodule = (project in file(\"submodule\"))\r\n```\r\n\r\nScala Steward will crash when it runs on such repository with \"Changes not staged for commit\".\r\n\r\nThis is because\r\n- scala-steward edits the dependencies **in submodule project**\r\n- Try to commit it\r\n- But it is impossible to commit the changes in the submodule work tree, so `git commit` end up with \"Changes not staged for commit\".\r\n\r\n## Solution\r\nWe can avoid this problem by using the `--ignore-submodules` option when checking the changes. https://git-scm.com/docs/git-status\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# NGINX: Canary doesn't progress over 5%\n\nThanks for the great project!\r\n\r\nI'm a first time user and trying to set Flagger up with Nginx. My issue is that when I update the image tag, the canary CRD doesn't move above 5%. The controller seems to start from scretch every iteration:\r\n\r\n```\r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:27.148Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:37.140Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:37.151Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:47.141Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:47.151Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:57.137Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:15:57.146Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:07.142Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:07.148Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:17.143Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:17.152Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:27.149Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:27.159Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:37.115Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:37.118Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:47.141Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:47.151Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} 
\r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:57.142Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:16:57.155Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:17:07.141Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:17:07.150Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:17:17.144Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Starting canary analysis for podinfo.test\",\"canary\":\"podinfo.test\"} \r\n{\"level\":\"info\",\"ts\":\"2019-08-11T08:17:17.153Z\",\"caller\":\"controller/controller.go:261\",\"msg\":\"Advance podinfo.test canary weight 5\",\"canary\":\"podinfo.test\"\r\n```\r\n\r\nThe only difference I noticed compared to https://docs.flagger.app/usage/nginx-progressive-delivery is that `ingresses.extensions/podinfo-canary` is not created for me when I create the CRD.\r\n\r\n```\r\n# generated \r\ndeployment.apps/podinfo-primary\r\nhorizontalpodautoscaler.autoscaling/podinfo-primary\r\nservice/podinfo\r\nservice/podinfo-canary\r\nservice/podinfo-primary\r\ningresses.extensions/podinfo-canary\r\n```\r\n\r\nI see no errors, the load generator is running..\r\n\r\nDoes anyone have an idea where to look for an error?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Azure Storage Authentication Error","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"#153 introduced compile problems with AOT","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How do i customize nativebase","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add support for GraphQL","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Document Nuxt integration\n\nIt would be good to include a brief explanation in the documentation about how to integrate vue-shortkey into Nuxt.\r\n\r\nAdd `/plugins/vue-shortkey.js`:\r\n\r\n import Vue from 'vue'\r\n const ShortKey = require('vue-shortkey')\r\n\r\n Vue.use(ShortKey, { prevent: ['input', 'textarea'] })\r\n\r\n export default ShortKey\r\n\r\nLoad the plugin in `nuxt.config.js`:\r\n\r\n plugins: [ { src: '@/plugins/vue-shortkey.js', ssr: false }]\r\n\r\nI don't have time to make a PR right now, so consider this a reminder. The disabling of SSR is especially important.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# GLOBAL SANDBOX ID,KEY,SALT (CONFIG DETAILS)\n\nSo, my test + live Merchant key & salt is identically same,\r\nPayu payment interface does open while using sandbox: false, \r\nwhen using sandbox: false it closes with Some error occurred toast ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Remove imports of javax.security.auth.message.** classes","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Improve README\n\nIt should briefly explain how to use the repo, contrasted with how to use a release.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add Swagger Docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Please add me.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"\"Bind\" vs \"on\"?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Failed to start Docker Application Container Engine. dockerd is not starting up.\n\n**_I'm Not bale to start Docker deamon on my system_**. I've tried a lot\r\nWhile running this command `sudo /usr/bin/dockerd -H unix:// -H tcp://0.0.0.0:2736`\r\n**Error is :**\r\n```\r\nINFO[2019-08-11T12:33:28.014891347+05:30] Starting up \r\nWARN[2019-08-11T12:33:28.055712897+05:30] [!] DON'T BIND ON ANY IP ADDRESS WITHOUT setting --tlsverify IF YOU DON'T KNOW WHAT YOU'RE DOING [!] \r\nINFO[2019-08-11T12:33:28.056828188+05:30] detected 127.0.0.53 nameserver, assuming systemd-resolved, so using resolv.conf: /run/systemd/resolve/resolv.conf \r\nINFO[2019-08-11T12:33:28.163263654+05:30] parsed scheme: \"unix\" module=grpc\r\nINFO[2019-08-11T12:33:28.163393851+05:30] scheme \"unix\" not registered, fallback to default scheme module=grpc\r\nINFO[2019-08-11T12:33:28.163491789+05:30] ccResolverWrapper: sending update to cc: {[{unix:///run/containerd/containerd.sock 0 <nil>}] } module=grpc\r\nINFO[2019-08-11T12:33:28.163553280+05:30] ClientConn switching balancer to \"pick_first\" module=grpc\r\nINFO[2019-08-11T12:33:28.177835197+05:30] pickfirstBalancer: HandleSubConnStateChange: 0xc0008248b0, CONNECTING module=grpc\r\nINFO[2019-08-11T12:33:28.178304336+05:30] blockingPicker: the picked transport is not ready, loop back to repick module=grpc\r\nINFO[2019-08-11T12:33:28.277757380+05:30] pickfirstBalancer: HandleSubConnStateChange: 0xc0008248b0, READY module=grpc\r\nINFO[2019-08-11T12:33:28.307256890+05:30] parsed scheme: \"unix\" module=grpc\r\nINFO[2019-08-11T12:33:28.307363806+05:30] scheme \"unix\" not registered, fallback to default scheme module=grpc\r\nINFO[2019-08-11T12:33:28.307429100+05:30] ccResolverWrapper: sending update to cc: {[{unix:///run/containerd/containerd.sock 0 <nil>}] } module=grpc\r\nINFO[2019-08-11T12:33:28.307467930+05:30] ClientConn switching balancer to \"pick_first\" module=grpc\r\nINFO[2019-08-11T12:33:28.307702267+05:30] pickfirstBalancer: HandleSubConnStateChange: 0xc000824d70, CONNECTING module=grpc\r\nINFO[2019-08-11T12:33:28.307797788+05:30] blockingPicker: the picked transport is not ready, loop back to repick module=grpc\r\nINFO[2019-08-11T12:33:28.309296205+05:30] pickfirstBalancer: HandleSubConnStateChange: 0xc000824d70, READY module=grpc\r\nINFO[2019-08-11T12:33:28.690331742+05:30] [graphdriver] using prior storage driver: overlay2 \r\nWARN[2019-08-11T12:33:29.033660669+05:30] Your kernel does not support swap memory limit \r\nWARN[2019-08-11T12:33:29.033755548+05:30] Your kernel does not support cgroup rt period \r\nWARN[2019-08-11T12:33:29.033781272+05:30] Your kernel does not support cgroup rt runtime \r\nWARN[2019-08-11T12:33:29.033802421+05:30] Your kernel does not support cgroup blkio weight \r\nWARN[2019-08-11T12:33:29.033832885+05:30] Your kernel does not support cgroup blkio weight_device \r\nINFO[2019-08-11T12:33:29.036414847+05:30] Loading containers: start. 
\r\nINFO[2019-08-11T12:33:31.465795155+05:30] stopping event stream following graceful shutdown error=\"<nil>\" module=libcontainerd namespace=moby\r\nfailed to start daemon: Error initializing network controller: error obtaining controller instance: failed to create NAT chain DOCKER: iptables failed: iptables -t nat -N DOCKER: iptables v1.6.1: can't initialize iptables table `nat': Memory allocation problem\r\nPerhaps iptables or your kernel needs to be upgraded.\r\n (exit status 3)\r\n```\r\n**Docker Service status is:**\r\n`systemctl status docker.service`\r\n\r\n**Output:**\r\n\r\n```\r\n\u25cf docker.service - Docker Application Container Engine\r\n Loaded: loaded (/lib/systemd/system/docker.service; enabled; vendor preset: enabled)\r\n Drop-In: /etc/systemd/system/docker.service.d\r\n \u2514\u2500hosts.conf\r\n Active: failed (Result: exit-code) since Sun 2019-08-11 12:35:04 IST; 16min ago\r\n Docs: https://docs.docker.com\r\n Process: 12473 ExecStart=/usr/bin/dockerd -H unix:// -H tcp://0.0.0.0:2736 (code=exited, status=1/FAILURE)\r\n Main PID: 12473 (code=exited, status=1/FAILURE)\r\n\r\nAug 11 12:35:04 BlackCat systemd[1]: docker.service: Service RestartSec=2s expired, scheduling restart.\r\nAug 11 12:35:04 BlackCat systemd[1]: docker.service: Scheduled restart job, restart counter is at 3.\r\nAug 11 12:35:04 BlackCat systemd[1]: Stopped Docker Application Container Engine.\r\nAug 11 12:35:04 BlackCat systemd[1]: docker.service: Start request repeated too quickly.\r\nAug 11 12:35:04 BlackCat systemd[1]: docker.service: Failed with result 'exit-code'.\r\nAug 11 12:35:04 BlackCat systemd[1]: Failed to start Docker Application Container Engine.\r\n```\r\n\r\n**Docker version:**\r\n```\r\nClient: Docker Engine - Community\r\n Version: 19.03.1\r\n API version: 1.40\r\n Go version: go1.12.5\r\n Git commit: 74b1e89\r\n Built: Thu Jul 25 21:21:22 2019\r\n OS/Arch: linux/amd64\r\n Experimental: false\r\nGot permission denied while trying to connect to the Docker daemon socket at unix:///var/run/docker.sock: Get http://%2Fvar%2Frun%2Fdocker.sock/v1.40/version: dial unix /var/run/docker.sock: connect: permission denied\r\n```\r\n\r\n**System info:**\r\n```\r\nDistributor ID: Ubuntu\r\nDescription: Ubuntu 19.04\r\nRelease: 19.04\r\nCodename: disco\r\nLinux kernel version : 5.1.7\r\n```\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Issues deploying code in Chapter 6\n\nI encountered some issues when reading Chapter 6 and deploying the code.\r\n\r\nI decided not to do a pull request for these issues since not all of them have simple fixes and you may want to rearrange names of env vars instead of using the workaround I used to get the region env var in for it to run.\r\n\r\n**Issue 1:**\r\n\r\nRight before `6.6. Testing the Pipeline`, on page 27 of the current PDF, there is a deploy command. When I ran it I got an error:\r\n\r\n```\r\nServerless Error ---------------------------------------\r\n \r\n An error occurred: TranslateEventSourceMappingKinesisC6ptransstream - Unrecognized event source, must be kinesis, dynamodb stream or sqs. Expected arn of format arn:aws:kinesis:us-east-1:111122223333:stream/my-stream (Service: AWSLambda; Status Code: 400; Error Code: InvalidParameterValueException; Request ID: e3a2c3e3-9415-4edf-bdfc-070188f96e47).\r\n```\r\n\r\nThe error message included `us-east-1` so I thought it might an issue with how the region was substituted.\r\n\r\nI replaced `AWS_REGION` with `TARGET_REGION` in the `.env` file we're preparing and then it worked. My `.env` file included vars for Kinesis that looked like this after this change:\r\n\r\n```\r\nCHAPTER6_PIPELINE_TRANSLATE_STREAM_ARN=arn:aws:kinesis:${TARGET_REGION}:${AWS_ACCOUNT_ID}:stream/${CHAPTER6_PIPELINE_TRANSLATE_STREAM}\r\nCHAPTER6_PIPELINE_SENTIMENT_STREAM_ARN=arn:aws:kinesis:${TARGET_REGION}:${AWS_ACCOUNT_ID}:stream/${CHAPTER6_PIPELINE_SENTIMENT_STREAM}\r\nCHAPTER6_CLASSIFIER_NAME=chap6classifier\r\nCHAPTER6_CLASSIFIER_ARN=arn:aws:comprehend:${TARGET_REGION}:${AWS_ACCOUNT_ID}:document-classifier/${CHAPTER6_CLASSIFIER_NAME}\r\n```\r\n\r\n**Issue 2:**\r\n\r\nI ran `streamReader.js` where we're instructed to, and I got the following error:\r\n\r\n```\r\nConfigError: Missing region in config\r\nat Request.VALIDATE_REGION\r\n```\r\n\r\nThat was the top of the stacktrace.\r\n\r\nI found a solution here: https://stackoverflow.com/questions/47009074/configuration-error-missing-region-in-config-aws\r\n\r\nI added this code:\r\n\r\n```js\r\nAWS.config.update({\r\n region: env.TARGET_REGION,\r\n});\r\n```\r\n\r\nin `streamReader.js` after the requires but before the AWS SDK is called. It worked.\r\n\r\n**Issue 3:**\r\n\r\nI had the same problem in train-classifier.js later in the chapter when I ran `train.sh`. Solved the same way, adding the region override code. This script didn't use the `dotenv` library so I had to use `process.env` to access the target region.\r\n\r\n```js\r\nAWS.config.update({\r\n region: process.env.TARGET_REGION\r\n})\r\n```\r\n\r\nI also had to add `export TARGET_REGION` to `train.sh` so that it would carry the env var through to the Node script. Then it ran properly.\r\n\r\nI had to make this same change to `status.sh` and `check-status.js` to be able to run it.\r\n\r\n**Issue 4:**\r\n\r\nWhen deploying the `classify` service, I got an error:\r\n\r\n```\r\nServerless Error ---------------------------------------\r\n \r\n Invalid variable reference syntax for variable env.CHAPTER6_DATA_ACCESS_ARN. You can only reference env vars, options, & files. You can check our docs for more info.\r\n```\r\n\r\nI examined the different `serverless.yml` files to compare them and noticed that this variable was being accessed with a period instead of a colon. I changed `env.` to `env:` in this service and then the deploy worked.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Work out redux immutability issues","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# How to install this with vim\n\nHi,\r\n I found the discussion hard to follow. I want to use cxxd with its accompanying vim plugin. Can you please outline the steps what needs to be done.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Create a findings goal to reliably display issues found after a scan\n\nThe Metrics API is not something that should be called directly after the bom upload as a number of async tasks are executed to create the Metrics analysis result.\r\n\r\nHowever the findings API is queryable after the bom upload and can provide details on the findings in the upload immediately.\r\n\r\n**Acceptance Criteria**\r\n- Add a `findings` goal to reliably display vulnerable components after a bom upload\r\n- Document the new goal and provide recommended usage\r\n- Update metrics documentation to describe its recommended usage","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# How are cubical subtypes supposed to be used?\n\n```\r\n_[_\u21a6_] : \u2200 {\u2113} (A : Set \u2113) (\u03c6 : I) (u : Partial \u03c6 A) \u2192 Set\u03c9\r\nA [ \u03c6 \u21a6 u ] = Sub A \u03c6 u\r\n```\r\n\r\nThere is just too little information in the documentation on how to use these or what are their purpose. Because `Sub` is built in, I can't gleam anything from it.\r\n\r\n```\r\ninc : \u2200 {\u2113} {A : Set \u2113} {\u03c6 : I} (u : A) \u2192 A [ \u03c6 \u21a6 (\u03bb _ \u2192 u) ]\r\nouc : \u2200 {\u2113} {A : Set \u2113} {\u03c6 : I} {u : Partial \u03c6 A} \u2192 A [ \u03c6 \u21a6 u ] \u2192 A\r\n```\r\n\r\nAlso these two do not seem to be in the library. Are they supposed to be builtins or is it possible to define them?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"a unfinnish work ...","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Docker fails when following instructions from DOCKER.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"images for readme.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Question: push menu from top/bottom","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Example or documentation on --index-url","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Missing readme how to use","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Add README.md\n\nadd a readme with config details","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Will the Error time be included if no errors are thrown?\n\nIf there is a better place to ask a 'How To' question for BenchMarkDotNet please tell me where.\r\n\r\nI was super excited when I completed my first benchmark and my project DataTiet.Net beat Entity Framework by a landslide:\r\n\r\n\r\nMethod | Mean | Error | StdDev\r\n-- | -- | -- | --\r\nLoadDTNSetting | 14.24 us | 0.0181 us | 0.0169 us\r\nLoadEFSetting | 195.93 us | 0.5839 us | 0.5462 us\r\n\r\n(simple test finding one record in a table by the primary key. If the EF Data Context creation was included, EF went to over 700).\r\n\r\nI am curious about one thing\r\n\r\nI see the error times listed, but I do not believe either product threw any errors. How is this number determined if no errors are thrown, or will it show zero if no errors are actually thrown?\r\n\r\nThanks. \r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [DOC] Docker Command To Run the Container Uses Deprecated Flag\n\n## Report incorrect documentation\r\n\r\n**Location of incorrect documentation**\r\nhttps://rapids.ai/start.html\r\n\r\n**Describe the problems or issues found in the documentation**\r\n\r\n\r\nThe line:\r\n\r\n```\r\ndocker run --runtime=nvidia --rm -it -p 8888:8888 -p 8787:8787 -p 8786:8786 \\\r\nrapidsai/rapidsai:0.8-cuda10.0-runtime-ubuntu18.04-gcc7-py3.7\r\n```\r\n\r\nUses the ``--runtime=nvidia`` flag which is for the deprecated `nvidia-docker2` package and raises an _Unkown runtime_ error if you are instead using the `nvidia-container-toolkit` (https://github.com/NVIDIA/nvidia-docker).\r\n\r\n**Steps taken to verify documentation is incorrect**\r\nI ran the command with the flag and it quit with an Unknown runtime error. I downgraded my docker package back to one compatible with nvidia-docker2 (it wouldn't install with my Ubuntu 19.04 version) and installed nvidia-docker2 and it ran without error. Then I removed nvidia-docker2 and reinstalled docker-ce 19.03 and the nvidia-container-toolkit and re-ran the command without the flag and it launched the container without error.\r\n\r\n**Suggested fix for documentation**\r\n\r\nRemove the flag but note that users of the `nvidia-docker2` package might need to use it if they are using a version of docker prior to 19.03.\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Change how default config values are handed in the code\n\nThe most common way to handle default values for config options seems to be to attempt to look up the key in the toml, then assign a default value if that fails, something like `config.get_as_type(key).unwrap_or(default_value)`. There are several potential problems I can see with this approach:\r\n\r\n- It allows the user to typo keys in their config and not be warned about it (e.g. `use_colour`)\r\n- It allows the *programmer* to typo keys in their code and not be warned about it (granted this should usually be caught on testing/code review)\r\n- If the documentation and code have accidentally diverged, you have to go to the source code of the module to look up the correct default arguments.\r\n\r\nA possible alternative would be to create a config manager class that holds `enums` of all the config options, along with their default values. You would then be able to query it by asking it for the value of the enum, e.g. we would replace\r\n\r\n```\r\n let signed_config_min = module.config_value_i64(\"min_time\").unwrap_or(2);\r\n```\r\n\r\nwith \r\n\r\n```\r\n let signed_config_min = module.get_config_value(config::min_time);\r\n```\r\n\r\nWe could even start to encode things like implied arguments (if one value is specified in the config, then multiple other values are implicitly set to certain values), or incompatible arguments across modules (if doing A in one module, cannot do B in another) into this config manager, and give starship the ability to check the toml for correctness.\r\n\r\nAdvantages:\r\n - Code that accesses invalid configuration options will not compile\r\n - Since code now knows about all config options, it can check user's toml file for inconsistencies or invalid config values\r\n - Simplifies reading in of options in module code\r\n\r\nDisadvantages:\r\n - Large initial pain to switch over\r\n - Cannot immediately see default value from within module code\r\n - Potential unseen complexities in implementation\r\n\r\nWould this seem like a net positive for us, or does it seem closer to neutral for what we have at the moment?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Your first contribution\n\n### Introduction to GitHub flow\n\nNow that you're familiar with issues, let's use this issue to track your path to your first contribution.\n\nPeople use different workflows to contribute to software projects, but the simplest and most effective way to contribute on GitHub is the GitHub flow.\n\n:tv: [Video: Understanding the GitHub flow](https://www.youtube.com/watch?v=PBI2Rz-ZOxU)\n\n<hr>\n<h3 align=\"center\">Read below for next steps</h3>\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[docs] Windows Instructions need updating.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"mongoreplay missing from mongodb package","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Downgrade headings on index page\n\n**Why?**\r\n\r\n- Index page will typically have a `<h1>` in the index header\r\n- Each post will typically have their own `<h1>`\r\n- This will lead to more than one `<h1>` on the index page, which doesn't make sense semantically and may have some SEO implications\r\n\r\n**What?**\r\n\r\n- [ ] Convert `<h2> --> <h3>`, `<h1> --> <h2>` (in that order)\r\n\r\n- [ ] Update documentation to describe this weird behaviour (don't post example html code before the break) ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"datetime (convert_datetime_type) seems to add in extra milliseconds","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Fix AOF rewrite for types missing it","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Native library fails to load in docker container","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Avoid wrapping div if single child is provided","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Website","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Review docs for correctness\n\nNeed to verify that the docs are correct and match the code.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"power calculation with negative base","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"T3, Semantic Segmentation Project - vgg_data","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Debounce operator not working as expected ??\n\n fun main() = runBlocking<Unit> {\r\n flow {\r\n emit(1)\r\n delay(99)\r\n emit(2)\r\n delay(99)\r\n emit(3)\r\n delay(1001)\r\n emit(4)\r\n delay(1001)\r\n emit(5)\r\n }.debounce(1000).collect {\r\n println(it) // print 3, 5\r\n }\r\n\r\nAbove code print 3, 5\r\nI expected it to print 3, 4, 5\r\n\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Problem with \"carriage return\" in Markdown Syntax","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Print test filename for easier \"go to file\" on integrated terminals like VS Code\n\nI suggest to add the filename (relative to the project root, or absolute if it's a trouble to make), in which the Assertion has failed, so it is easier to navigate to that file on integrated terminals like the Visual Studio Code Terminal (You can Alt+Click any absolute or relative file path and it will open in the same editor) \r\n\r\nActual: \r\n```bash\r\n> [email protected] unit-tests C:\\repos\\docs_gm\\docs_gm-core\r\n> alsatian \"./dist/tests/unit/**/*.spec.js\"\r\n\r\n Pass: 190 / 191\r\n Fail: 1 / 191\r\n Ignore: 0 / 191\r\n\r\n FAIL: ConfigOverriderFixture > override_output ( \"design\" )\r\n Expected undefined to be \\\"foo\\\".\r\n\r\n expected:\r\n \"foo\"\r\n actual:\r\n undefined\r\n```\r\n\r\nSuggested: \r\n```bash\r\n> [email protected] unit-tests C:\\repos\\docs_gm\\docs_gm-core\r\n> alsatian \"./dist/tests/unit/**/*.spec.js\"\r\n\r\n Pass: 190 / 191\r\n Fail: 1 / 191\r\n Ignore: 0 / 191\r\n\r\n FAIL: ConfigOverriderFixture > override_output ( \"design\" )\r\n at file: tests/unit/config/ConfigOverrider.spec.ts\r\n Expected undefined to be \\\"foo\\\".\r\n\r\n expected:\r\n \"foo\"\r\n actual:\r\n undefined\r\n```\r\n\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Use db.changes( to update docs that are displayed\n\nhttps://pouchdb.com/guides/changes.html\r\n\r\nWill replace some of the save doc behavior as we won't need to manually update, delete, refresh, etc from those functions. ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"No Mapbox page shell","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update readme to reflect the new repo location","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"No long work with Vue 1.x since v.0.8.0","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update Docs for Managing API Resources","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# CVE-2019-10744 (High) detected in lodash-1.0.2.tgz\n\n## CVE-2019-10744 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-1.0.2.tgz</b></p></summary>\n\n<p>A utility library delivering consistency, customization, performance, and extras.</p>\n<p>Library home page: <a href=\"https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz\">https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz</a></p>\n<p>Path to dependency file: /website/docs/package.json</p>\n<p>Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json</p>\n<p>\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nVersions of lodash lower than 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.\n\n<p>Publish Date: 2019-07-26\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-10744>CVE-2019-10744</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790\">https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790</a></p>\n<p>Release Date: 2019-07-08</p>\n<p>Fix Resolution: 4.17.12</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Running error on arm\n\nGood guys\r\n\u00a0\u00a0 I downloaded the source code and tried to cross-compile the unix directory (arm-linux-gcc). Specific steps are as follows:\r\n1. I tried to run configure, but I was given an error. After I modified the configure, I just managed to run it. This is the file I modified.\r\n #! /bin/sh\r\n# From configure.ac Revision: 1.60 .\r\n# Guess values for system-dependent variables and create Makefiles.\r\n# Generated by GNU Autoconf 2.69 for Snes9x 1.60.\r\n#\r\n#\r\n# Copyright (C) 1992-1996, 1998-2012 Free Software Foundation, Inc.\r\n#\r\n#\r\n# This configure script is free software; the Free Software Foundation\r\n# gives unlimited permission to copy, distribute and modify it.\r\n## -------------------- ##\r\n## M4sh Initialization. ##\r\n## -------------------- ##\r\n\r\n# Be more Bourne compatible\r\nDUALCASE=1; export DUALCASE # for MKS sh\r\nif test -n \"${ZSH_VERSION+set}\" && (emulate sh) >/dev/null 2>&1; then :\r\n emulate sh\r\n NULLCMD=:\r\n # Pre-4.2 versions of Zsh do word splitting on ${1+\"$@\"}, which\r\n # is contrary to our usage. Disable this feature.\r\n alias -g '${1+\"$@\"}'='\"$@\"'\r\n setopt NO_GLOB_SUBST\r\nelse\r\n case `(set -o) 2>/dev/null` in #(\r\n *posix*) :\r\n set -o posix ;; #(\r\n *) :\r\n ;;\r\nesac\r\nfi\r\n\r\n\r\nas_nl='\r\n'\r\nexport as_nl\r\n# Printing a long string crashes Solaris 7 /usr/bin/printf.\r\nas_echo='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\'\r\nas_echo=$as_echo$as_echo$as_echo$as_echo$as_echo\r\nas_echo=$as_echo$as_echo$as_echo$as_echo$as_echo$as_echo\r\n# Prefer a ksh shell builtin over an external printf program on Solaris,\r\n# but without wasting forks for bash or zsh.\r\nif test -z \"$BASH_VERSION$ZSH_VERSION\" \\\r\n && (test \"X`print -r -- $as_echo`\" = \"X$as_echo\") 2>/dev/null; then\r\n as_echo='print -r --'\r\n as_echo_n='print -rn --'\r\nelif (test \"X`printf %s $as_echo`\" = \"X$as_echo\") 2>/dev/null; then\r\n as_echo='printf %s\\n'\r\n as_echo_n='printf %s'\r\nelse\r\n if test \"X`(/usr/ucb/echo -n -n $as_echo) 2>/dev/null`\" = \"X-n $as_echo\"; then\r\n as_echo_body='eval /usr/ucb/echo -n \"$1$as_nl\"'\r\n as_echo_n='/usr/ucb/echo -n'\r\n else\r\n as_echo_body='eval expr \"X$1\" : \"X\\\\(.*\\\\)\"'\r\n as_echo_n_body='eval\r\n arg=$1;\r\n case $arg in #(\r\n *\"$as_nl\"*)\r\n\texpr \"X$arg\" : \"X\\\\(.*\\\\)$as_nl\";\r\n\targ=`expr \"X$arg\" : \".*$as_nl\\\\(.*\\\\)\"`;;\r\n esac;\r\n expr \"X$arg\" : \"X\\\\(.*\\\\)\" | tr -d \"$as_nl\"\r\n '\r\n export as_echo_n_body\r\n as_echo_n='sh -c $as_echo_n_body as_echo'\r\n fi\r\n export as_echo_body\r\n as_echo='sh -c $as_echo_body as_echo'\r\nfi\r\n\r\n# The user is always right.\r\nif test \"${PATH_SEPARATOR+set}\" != set; then\r\n PATH_SEPARATOR=:\r\n (PATH='/bin;/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 && {\r\n (PATH='/bin:/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 ||\r\n PATH_SEPARATOR=';'\r\n }\r\nfi\r\n\r\n\r\n# IFS\r\n# We need space, tab and new line, in precisely that order. Quoting is\r\n# there to prevent editors from complaining about space-tab.\r\n# (If _AS_PATH_WALK were called with IFS unset, it would disable word\r\n# splitting by setting IFS to empty value.)\r\nIFS=\" \"\"\t$as_nl\"\r\n\r\n# Find who we are. 
Look in the path if we contain no directory separator.\r\nas_myself=\r\ncase $0 in #((\r\n *[\\\\/]* ) as_myself=$0 ;;\r\n *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n test -r \"$as_dir/$0\" && as_myself=$as_dir/$0 && break\r\n done\r\nIFS=$as_save_IFS\r\n\r\n ;;\r\nesac\r\n# We did not find ourselves, most probably we were run as `sh COMMAND'\r\n# in which case we are not to be found in the path.\r\nif test \"x$as_myself\" = x; then\r\n as_myself=$0\r\nfi\r\nif test ! -f \"$as_myself\"; then\r\n $as_echo \"$as_myself: error: cannot find myself; rerun with an absolute file name\" >&2\r\n exit 1\r\nfi\r\n\r\n# Unset variables that we do not need and which cause bugs (e.g. in\r\n# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the \"|| exit 1\"\r\n# suppresses any \"Segmentation fault\" message there. '((' could\r\n# trigger a bug in pdksh 5.2.14.\r\nfor as_var in BASH_ENV ENV MAIL MAILPATH\r\ndo eval test x\\${$as_var+set} = xset \\\r\n && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :\r\ndone\r\nPS1='$ '\r\nPS2='> '\r\nPS4='+ '\r\n\r\n# NLS nuisances.\r\nLC_ALL=C\r\nexport LC_ALL\r\nLANGUAGE=C\r\nexport LANGUAGE\r\n\r\n# CDPATH.\r\n(unset CDPATH) >/dev/null 2>&1 && unset CDPATH\r\n\r\n# Use a proper internal environment variable to ensure we don't fall\r\n # into an infinite loop, continuously re-executing ourselves.\r\n if test x\"${_as_can_reexec}\" != xno && test \"x$CONFIG_SHELL\" != x; then\r\n _as_can_reexec=no; export _as_can_reexec;\r\n # We cannot yet assume a decent shell, so we have to provide a\r\n# neutralization value for shells without unset; and this also\r\n# works around shells that cannot unset nonexistent variables.\r\n# Preserve -v and -x to the replacement shell.\r\nBASH_ENV=/dev/null\r\nENV=/dev/null\r\n(unset BASH_ENV) >/dev/null 2>&1 && unset BASH_ENV ENV\r\ncase $- in # ((((\r\n *v*x* | *x*v* ) as_opts=-vx ;;\r\n *v* ) as_opts=-v ;;\r\n *x* ) as_opts=-x ;;\r\n * ) as_opts= ;;\r\nesac\r\nexec $CONFIG_SHELL $as_opts \"$as_myself\" ${1+\"$@\"}\r\n# Admittedly, this is quite paranoid, since all the known shells bail\r\n# out after a failed `exec'.\r\n$as_echo \"$0: could not re-execute with $CONFIG_SHELL\" >&2\r\nas_fn_exit 255\r\n fi\r\n # We don't want this to propagate to other subprocesses.\r\n { _as_can_reexec=; unset _as_can_reexec;}\r\nif test \"x$CONFIG_SHELL\" = x; then\r\n as_bourne_compatible=\"if test -n \\\"\\${ZSH_VERSION+set}\\\" && (emulate sh) >/dev/null 2>&1; then :\r\n emulate sh\r\n NULLCMD=:\r\n # Pre-4.2 versions of Zsh do word splitting on \\${1+\\\"\\$@\\\"}, which\r\n # is contrary to our usage. 
Disable this feature.\r\n alias -g '\\${1+\\\"\\$@\\\"}'='\\\"\\$@\\\"'\r\n setopt NO_GLOB_SUBST\r\nelse\r\n case \\`(set -o) 2>/dev/null\\` in #(\r\n *posix*) :\r\n set -o posix ;; #(\r\n *) :\r\n ;;\r\nesac\r\nfi\r\n\"\r\n as_required=\"as_fn_return () { (exit \\$1); }\r\nas_fn_success () { as_fn_return 0; }\r\nas_fn_failure () { as_fn_return 1; }\r\nas_fn_ret_success () { return 0; }\r\nas_fn_ret_failure () { return 1; }\r\n\r\nexitcode=0\r\nas_fn_success || { exitcode=1; echo as_fn_success failed.; }\r\nas_fn_failure && { exitcode=1; echo as_fn_failure succeeded.; }\r\nas_fn_ret_success || { exitcode=1; echo as_fn_ret_success failed.; }\r\nas_fn_ret_failure && { exitcode=1; echo as_fn_ret_failure succeeded.; }\r\nif ( set x; as_fn_ret_success y && test x = \\\"\\$1\\\" ); then :\r\n\r\nelse\r\n exitcode=1; echo positional parameters were not saved.\r\nfi\r\ntest x\\$exitcode = x0 || exit 1\r\ntest -x / || exit 1\"\r\n as_suggested=\" as_lineno_1=\";as_suggested=$as_suggested$LINENO;as_suggested=$as_suggested\" as_lineno_1a=\\$LINENO\r\n as_lineno_2=\";as_suggested=$as_suggested$LINENO;as_suggested=$as_suggested\" as_lineno_2a=\\$LINENO\r\n eval 'test \\\"x\\$as_lineno_1'\\$as_run'\\\" != \\\"x\\$as_lineno_2'\\$as_run'\\\" &&\r\n test \\\"x\\`expr \\$as_lineno_1'\\$as_run' + 1\\`\\\" = \\\"x\\$as_lineno_2'\\$as_run'\\\"' || exit 1\r\ntest \\$(( 1 + 1 )) = 2 || exit 1\"\r\n if (eval \"$as_required\") 2>/dev/null; then :\r\n as_have_required=yes\r\nelse\r\n as_have_required=no\r\nfi\r\n if test x$as_have_required = xyes && (eval \"$as_suggested\") 2>/dev/null; then :\r\n\r\nelse\r\n as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nas_found=false\r\nfor as_dir in /bin$PATH_SEPARATOR/usr/bin$PATH_SEPARATOR$PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n as_found=:\r\n case $as_dir in #(\r\n\t /*)\r\n\t for as_base in sh bash ksh sh5; do\r\n\t # Try only shells that exist, to save several forks.\r\n\t as_shell=$as_dir/$as_base\r\n\t if { test -f \"$as_shell\" || test -f \"$as_shell.exe\"; } &&\r\n\t\t { $as_echo \"$as_bourne_compatible\"\"$as_required\" | as_run=a \"$as_shell\"; } 2>/dev/null; then :\r\n CONFIG_SHELL=$as_shell as_have_required=yes\r\n\t\t if { $as_echo \"$as_bourne_compatible\"\"$as_suggested\" | as_run=a \"$as_shell\"; } 2>/dev/null; then :\r\n break 2\r\nfi\r\nfi\r\n\t done;;\r\n esac\r\n as_found=false\r\ndone\r\n$as_found || { if { test -f \"$SHELL\" || test -f \"$SHELL.exe\"; } &&\r\n\t { $as_echo \"$as_bourne_compatible\"\"$as_required\" | as_run=a \"$SHELL\"; } 2>/dev/null; then :\r\n CONFIG_SHELL=$SHELL as_have_required=yes\r\nfi; }\r\nIFS=$as_save_IFS\r\n\r\n\r\n if test \"x$CONFIG_SHELL\" != x; then :\r\n export CONFIG_SHELL\r\n # We cannot yet assume a decent shell, so we have to provide a\r\n# neutralization value for shells without unset; and this also\r\n# works around shells that cannot unset nonexistent variables.\r\n# Preserve -v and -x to the replacement shell.\r\nBASH_ENV=/dev/null\r\nENV=/dev/null\r\n(unset BASH_ENV) >/dev/null 2>&1 && unset BASH_ENV ENV\r\ncase $- in # ((((\r\n *v*x* | *x*v* ) as_opts=-vx ;;\r\n *v* ) as_opts=-v ;;\r\n *x* ) as_opts=-x ;;\r\n * ) as_opts= ;;\r\nesac\r\nexec $CONFIG_SHELL $as_opts \"$as_myself\" ${1+\"$@\"}\r\n# Admittedly, this is quite paranoid, since all the known shells bail\r\n# out after a failed `exec'.\r\n$as_echo \"$0: could not re-execute with $CONFIG_SHELL\" >&2\r\nexit 255\r\nfi\r\n\r\n if test x$as_have_required = xno; then :\r\n $as_echo \"$0: This script requires a shell more modern than 
all\"\r\n $as_echo \"$0: the shells that I found on your system.\"\r\n if test x${ZSH_VERSION+set} = xset ; then\r\n $as_echo \"$0: In particular, zsh $ZSH_VERSION has bugs and should\"\r\n $as_echo \"$0: be upgraded to zsh 4.3.4 or later.\"\r\n else\r\n $as_echo \"$0: Please tell [email protected] about your system,\r\n$0: including any error possibly output before this\r\n$0: message. Then install a modern shell, or manually run\r\n$0: the script under such a shell if you do have one.\"\r\n fi\r\n exit 1\r\nfi\r\nfi\r\nfi\r\nSHELL=${CONFIG_SHELL-/bin/sh}\r\nexport SHELL\r\n# Unset more variables known to interfere with behavior of common tools.\r\nCLICOLOR_FORCE= GREP_OPTIONS=\r\nunset CLICOLOR_FORCE GREP_OPTIONS\r\n\r\n## --------------------- ##\r\n## M4sh Shell Functions. ##\r\n## --------------------- ##\r\n# as_fn_unset VAR\r\n# ---------------\r\n# Portably unset VAR.\r\nas_fn_unset ()\r\n{\r\n { eval $1=; unset $1;}\r\n}\r\nas_unset=as_fn_unset\r\n\r\n# as_fn_set_status STATUS\r\n# -----------------------\r\n# Set $? to STATUS, without forking.\r\nas_fn_set_status ()\r\n{\r\n return $1\r\n} # as_fn_set_status\r\n\r\n# as_fn_exit STATUS\r\n# -----------------\r\n# Exit the shell with STATUS, even in a \"trap 0\" or \"set -e\" context.\r\nas_fn_exit ()\r\n{\r\n set +e\r\n as_fn_set_status $1\r\n exit $1\r\n} # as_fn_exit\r\n\r\n# as_fn_mkdir_p\r\n# -------------\r\n# Create \"$as_dir\" as a directory, including parents if necessary.\r\nas_fn_mkdir_p ()\r\n{\r\n\r\n case $as_dir in #(\r\n -*) as_dir=./$as_dir;;\r\n esac\r\n test -d \"$as_dir\" || eval $as_mkdir_p || {\r\n as_dirs=\r\n while :; do\r\n case $as_dir in #(\r\n *\\'*) as_qdir=`$as_echo \"$as_dir\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"`;; #'(\r\n *) as_qdir=$as_dir;;\r\n esac\r\n as_dirs=\"'$as_qdir' $as_dirs\"\r\n as_dir=`$as_dirname -- \"$as_dir\" ||\r\n$as_expr X\"$as_dir\" : 'X\\(.*[^/]\\)//*[^/][^/]*/*$' \\| \\\r\n\t X\"$as_dir\" : 'X\\(//\\)[^/]' \\| \\\r\n\t X\"$as_dir\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$as_dir\" : 'X\\(/\\)' \\| . 2>/dev/null ||\r\n$as_echo X\"$as_dir\" |\r\n sed '/^X\\(.*[^/]\\)\\/\\/*[^/][^/]*\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)[^/].*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n test -d \"$as_dir\" && break\r\n done\r\n test -z \"$as_dirs\" || eval \"mkdir $as_dirs\"\r\n } || test -d \"$as_dir\" || as_fn_error $? \"cannot create directory $as_dir\"\r\n\r\n\r\n} # as_fn_mkdir_p\r\n\r\n# as_fn_executable_p FILE\r\n# -----------------------\r\n# Test if FILE is an executable regular file.\r\nas_fn_executable_p ()\r\n{\r\n test -f \"$1\" && test -x \"$1\"\r\n} # as_fn_executable_p\r\n# as_fn_append VAR VALUE\r\n# ----------------------\r\n# Append the text in VALUE to the end of the definition contained in VAR. Take\r\n# advantage of any shell optimizations that allow amortized linear growth over\r\n# repeated appends, instead of the typical quadratic growth present in naive\r\n# implementations.\r\nif (eval \"as_var=1; as_var+=2; test x\\$as_var = x12\") 2>/dev/null; then :\r\n eval 'as_fn_append ()\r\n {\r\n eval $1+=\\$2\r\n }'\r\nelse\r\n as_fn_append ()\r\n {\r\n eval $1=\\$$1\\$2\r\n }\r\nfi # as_fn_append\r\n\r\n# as_fn_arith ARG...\r\n# ------------------\r\n# Perform arithmetic evaluation on the ARGs, and store the result in the\r\n# global $as_val. Take advantage of shells that can avoid forks. 
The arguments\r\n# must be portable across $(()) and expr.\r\nif (eval \"test \\$(( 1 + 1 )) = 2\") 2>/dev/null; then :\r\n eval 'as_fn_arith ()\r\n {\r\n as_val=$(( $* ))\r\n }'\r\nelse\r\n as_fn_arith ()\r\n {\r\n as_val=`expr \"$@\" || test $? -eq 1`\r\n }\r\nfi # as_fn_arith\r\n\r\n\r\n# as_fn_error STATUS ERROR [LINENO LOG_FD]\r\n# ----------------------------------------\r\n# Output \"`basename $0`: error: ERROR\" to stderr. If LINENO and LOG_FD are\r\n# provided, also output the error to LOG_FD, referencing LINENO. Then exit the\r\n# script with STATUS, using 1 if that was 0.\r\nas_fn_error ()\r\n{\r\n as_status=$1; test $as_status -eq 0 && as_status=1\r\n if test \"$4\"; then\r\n as_lineno=${as_lineno-\"$3\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: error: $2\" >&$4\r\n fi\r\n $as_echo \"$as_me: error: $2\" >&2\r\n as_fn_exit $as_status\r\n} # as_fn_error\r\n\r\nif expr a : '\\(a\\)' >/dev/null 2>&1 &&\r\n test \"X`expr 00001 : '.*\\(...\\)'`\" = X001; then\r\n as_expr=expr\r\nelse\r\n as_expr=false\r\nfi\r\n\r\nif (basename -- /) >/dev/null 2>&1 && test \"X`basename -- / 2>&1`\" = \"X/\"; then\r\n as_basename=basename\r\nelse\r\n as_basename=false\r\nfi\r\n\r\nif (as_dir=`dirname -- /` && test \"X$as_dir\" = X/) >/dev/null 2>&1; then\r\n as_dirname=dirname\r\nelse\r\n as_dirname=false\r\nfi\r\n\r\nas_me=`$as_basename -- \"$0\" ||\r\n$as_expr X/\"$0\" : '.*/\\([^/][^/]*\\)/*$' \\| \\\r\n\t X\"$0\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$0\" : 'X\\(/\\)' \\| . 2>/dev/null ||\r\n$as_echo X/\"$0\" |\r\n sed '/^.*\\/\\([^/][^/]*\\)\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\/\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\/\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n\r\n# Avoid depending upon Character Ranges.\r\nas_cr_letters='abcdefghijklmnopqrstuvwxyz'\r\nas_cr_LETTERS='ABCDEFGHIJKLMNOPQRSTUVWXYZ'\r\nas_cr_Letters=$as_cr_letters$as_cr_LETTERS\r\nas_cr_digits='0123456789'\r\nas_cr_alnum=$as_cr_Letters$as_cr_digits\r\n\r\n\r\n as_lineno_1=$LINENO as_lineno_1a=$LINENO\r\n as_lineno_2=$LINENO as_lineno_2a=$LINENO\r\n eval 'test \"x$as_lineno_1'$as_run'\" != \"x$as_lineno_2'$as_run'\" &&\r\n test \"x`expr $as_lineno_1'$as_run' + 1`\" = \"x$as_lineno_2'$as_run'\"' || {\r\n # Blame Lee E. McMahon (1931-1989) for sed's syntax. :-)\r\n sed -n '\r\n p\r\n /[$]LINENO/=\r\n ' <$as_myself |\r\n sed '\r\n s/[$]LINENO.*/&-/\r\n t lineno\r\n b\r\n :lineno\r\n N\r\n :loop\r\n s/[$]LINENO\\([^'$as_cr_alnum'_].*\\n\\)\\(.*\\)/\\2\\1\\2/\r\n t loop\r\n s/-\\n.*//\r\n ' >$as_me.lineno &&\r\n chmod +x \"$as_me.lineno\" ||\r\n { $as_echo \"$as_me: error: cannot create $as_me.lineno; rerun with a POSIX shell\" >&2; as_fn_exit 1; }\r\n\r\n # If we had to re-execute with $CONFIG_SHELL, we're ensured to have\r\n # already done that, so ensure we don't try to do so again and fall\r\n # in an infinite loop. This has already happened in practice.\r\n _as_can_reexec=no; export _as_can_reexec\r\n # Don't try to exec as it changes $[0], causing all sort of problems\r\n # (the dirname of $[0] is not the place where we might find the\r\n # original and so on. Autoconf is especially sensitive to this).\r\n . 
\"./$as_me.lineno\"\r\n # Exit status is that of the last command.\r\n exit\r\n}\r\n\r\nECHO_C= ECHO_N= ECHO_T=\r\ncase `echo -n x` in #(((((\r\n-n*)\r\n case `echo 'xy\\c'` in\r\n *c*) ECHO_T='\t';;\t# ECHO_T is single tab character.\r\n xy) ECHO_C='\\c';;\r\n *) echo `echo ksh88 bug on AIX 6.1` > /dev/null\r\n ECHO_T='\t';;\r\n esac;;\r\n*)\r\n ECHO_N='-n';;\r\nesac\r\n\r\nrm -f conf$$ conf$$.exe conf$$.file\r\nif test -d conf$$.dir; then\r\n rm -f conf$$.dir/conf$$.file\r\nelse\r\n rm -f conf$$.dir\r\n mkdir conf$$.dir 2>/dev/null\r\nfi\r\nif (echo >conf$$.file) 2>/dev/null; then\r\n if ln -s conf$$.file conf$$ 2>/dev/null; then\r\n as_ln_s='ln -s'\r\n # ... but there are two gotchas:\r\n # 1) On MSYS, both `ln -s file dir' and `ln file dir' fail.\r\n # 2) DJGPP < 2.04 has no symlinks; `ln -s' creates a wrapper executable.\r\n # In both cases, we have to default to `cp -pR'.\r\n ln -s conf$$.file conf$$.dir 2>/dev/null && test ! -f conf$$.exe ||\r\n as_ln_s='cp -pR'\r\n elif ln conf$$.file conf$$ 2>/dev/null; then\r\n as_ln_s=ln\r\n else\r\n as_ln_s='cp -pR'\r\n fi\r\nelse\r\n as_ln_s='cp -pR'\r\nfi\r\nrm -f conf$$ conf$$.exe conf$$.dir/conf$$.file conf$$.file\r\nrmdir conf$$.dir 2>/dev/null\r\n\r\nif mkdir -p . 2>/dev/null; then\r\n as_mkdir_p='mkdir -p \"$as_dir\"'\r\nelse\r\n test -d ./-p && rmdir ./-p\r\n as_mkdir_p=false\r\nfi\r\n\r\nas_test_x='test -x'\r\nas_executable_p=as_fn_executable_p\r\n\r\n# Sed expression to map a string onto a valid CPP name.\r\nas_tr_cpp=\"eval sed 'y%*$as_cr_letters%P$as_cr_LETTERS%;s%[^_$as_cr_alnum]%_%g'\"\r\n\r\n# Sed expression to map a string onto a valid variable name.\r\nas_tr_sh=\"eval sed 'y%*+%pp%;s%[^_$as_cr_alnum]%_%g'\"\r\n\r\n\r\ntest -n \"$DJDIR\" || exec 7<&0 </dev/null\r\nexec 6>&1\r\n\r\n# Name of the host.\r\n# hostname on some systems (SVR3.2, old GNU/Linux) returns a bogus exit status,\r\n# so uname gets run too.\r\nac_hostname=`(hostname || uname -n) 2>/dev/null | sed 1q`\r\n\r\n#\r\n# Initializations.\r\n#\r\nac_default_prefix=/usr/local\r\nac_clean_files=\r\nac_config_libobj_dir=.\r\nLIBOBJS=\r\ncross_compiling=no\r\nsubdirs=\r\nMFLAGS=\r\nMAKEFLAGS=\r\n\r\n# Identity of this package.\r\nPACKAGE_NAME='Snes9x'\r\nPACKAGE_TARNAME='snes9x'\r\nPACKAGE_VERSION='1.60'\r\nPACKAGE_STRING='Snes9x 1.60'\r\nPACKAGE_BUGREPORT=''\r\nPACKAGE_URL=''\r\n\r\nac_unique_file=\"unix.cpp\"\r\n# Factoring default headers for most tests.\r\nac_includes_default=\"\\\r\n#include <stdio.h>\r\n#ifdef HAVE_SYS_TYPES_H\r\n# include <sys/types.h>\r\n#endif\r\n#ifdef HAVE_SYS_STAT_H\r\n# include <sys/stat.h>\r\n#endif\r\n#ifdef STDC_HEADERS\r\n# include <stdlib.h>\r\n# include <stddef.h>\r\n#else\r\n# ifdef HAVE_STDLIB_H\r\n# include <stdlib.h>\r\n# endif\r\n#endif\r\n#ifdef HAVE_STRING_H\r\n# if !defined STDC_HEADERS && defined HAVE_MEMORY_H\r\n# include <memory.h>\r\n# endif\r\n# include <string.h>\r\n#endif\r\n#ifdef HAVE_STRINGS_H\r\n# include <strings.h>\r\n#endif\r\n#ifdef HAVE_INTTYPES_H\r\n# include <inttypes.h>\r\n#endif\r\n#ifdef HAVE_STDINT_H\r\n# include <stdint.h>\r\n#endif\r\n#ifdef HAVE_UNISTD_H\r\n# include 
<unistd.h>\r\n#endif\"\r\n\r\nac_subst_vars='LTLIBOBJS\r\nLIBOBJS\r\nS9X_SYSTEM_ZIP\r\nS9XJMA\r\nS9XZIP\r\nS9XNETPLAY\r\nS9XDEBUGGER\r\nS9XXVIDEO\r\nS9XLIBS\r\nS9XDEFS\r\nS9XFLGS\r\nX_EXTRA_LIBS\r\nX_LIBS\r\nX_PRE_LIBS\r\nX_CFLAGS\r\nXMKMF\r\nSYSTEM_ZIP_LIBS\r\nSYSTEM_ZIP_CFLAGS\r\nPKG_CONFIG_LIBDIR\r\nPKG_CONFIG_PATH\r\nPKG_CONFIG\r\nEGREP\r\nGREP\r\nCXXCPP\r\nac_ct_CXX\r\nCXXFLAGS\r\nCXX\r\nOBJEXT\r\nEXEEXT\r\nac_ct_CC\r\nCPPFLAGS\r\nLDFLAGS\r\nCFLAGS\r\nCC\r\ntarget_os\r\ntarget_vendor\r\ntarget_cpu\r\ntarget\r\nhost_os\r\nhost_vendor\r\nhost_cpu\r\nhost\r\nbuild_os\r\nbuild_vendor\r\nbuild_cpu\r\nbuild\r\ntarget_alias\r\nhost_alias\r\nbuild_alias\r\nLIBS\r\nECHO_T\r\nECHO_N\r\nECHO_C\r\nDEFS\r\nmandir\r\nlocaledir\r\nlibdir\r\npsdir\r\npdfdir\r\ndvidir\r\nhtmldir\r\ninfodir\r\ndocdir\r\noldincludedir\r\nincludedir\r\nlocalstatedir\r\nsharedstatedir\r\nsysconfdir\r\ndatadir\r\ndatarootdir\r\nlibexecdir\r\nsbindir\r\nbindir\r\nprogram_transform_name\r\nprefix\r\nexec_prefix\r\nPACKAGE_URL\r\nPACKAGE_BUGREPORT\r\nPACKAGE_STRING\r\nPACKAGE_VERSION\r\nPACKAGE_TARNAME\r\nPACKAGE_NAME\r\nPATH_SEPARATOR\r\nSHELL'\r\nac_subst_files=''\r\nac_user_opts='\r\nenable_option_checking\r\nenable_debug\r\nenable_mtune\r\nenable_sse41\r\nenable_avx2\r\nenable_neon\r\nenable_gamepad\r\nenable_debugger\r\nenable_netplay\r\nenable_gzip\r\nenable_zip\r\nwith_system_zip\r\nenable_jma\r\nenable_screenshot\r\nwith_x\r\nenable_xvideo\r\nenable_xinerama\r\nenable_sound\r\n'\r\n ac_precious_vars='build_alias\r\nhost_alias\r\ntarget_alias\r\nCC\r\nCFLAGS\r\nLDFLAGS\r\nLIBS\r\nCPPFLAGS\r\nCXX\r\nCXXFLAGS\r\nCCC\r\nCXXCPP\r\nPKG_CONFIG\r\nPKG_CONFIG_PATH\r\nPKG_CONFIG_LIBDIR\r\nSYSTEM_ZIP_CFLAGS\r\nSYSTEM_ZIP_LIBS\r\nXMKMF'\r\n\r\n\r\n# Initialize some variables set by options.\r\nac_init_help=\r\nac_init_version=false\r\nac_unrecognized_opts=\r\nac_unrecognized_sep=\r\n# The variables have the same names as the options, with\r\n# dashes changed to underlines.\r\ncache_file=/dev/null\r\nexec_prefix=NONE\r\nno_create=\r\nno_recursion=\r\nprefix=NONE\r\nprogram_prefix=NONE\r\nprogram_suffix=NONE\r\nprogram_transform_name=s,x,x,\r\nsilent=\r\nsite=\r\nsrcdir=\r\nverbose=\r\nx_includes=NONE\r\nx_libraries=NONE\r\n\r\n# Installation directory options.\r\n# These are left unexpanded so users can \"make install exec_prefix=/foo\"\r\n# and all the variables that are supposed to be based on exec_prefix\r\n# by default will actually change.\r\n# Use braces instead of parens because sh, perl, etc. 
also accept them.\r\n# (The list follows the same order as the GNU Coding Standards.)\r\nbindir='${exec_prefix}/bin'\r\nsbindir='${exec_prefix}/sbin'\r\nlibexecdir='${exec_prefix}/libexec'\r\ndatarootdir='${prefix}/share'\r\ndatadir='${datarootdir}'\r\nsysconfdir='${prefix}/etc'\r\nsharedstatedir='${prefix}/com'\r\nlocalstatedir='${prefix}/var'\r\nincludedir='${prefix}/include'\r\noldincludedir='/usr/include'\r\ndocdir='${datarootdir}/doc/${PACKAGE_TARNAME}'\r\ninfodir='${datarootdir}/info'\r\nhtmldir='${docdir}'\r\ndvidir='${docdir}'\r\npdfdir='${docdir}'\r\npsdir='${docdir}'\r\nlibdir='${exec_prefix}/lib'\r\nlocaledir='${datarootdir}/locale'\r\nmandir='${datarootdir}/man'\r\n\r\nac_prev=\r\nac_dashdash=\r\nfor ac_option\r\ndo\r\n # If the previous option needs an argument, assign it.\r\n if test -n \"$ac_prev\"; then\r\n eval $ac_prev=\\$ac_option\r\n ac_prev=\r\n continue\r\n fi\r\n\r\n case $ac_option in\r\n *=?*) ac_optarg=`expr \"X$ac_option\" : '[^=]*=\\(.*\\)'` ;;\r\n *=) ac_optarg= ;;\r\n *) ac_optarg=yes ;;\r\n esac\r\n\r\n # Accept the important Cygnus configure options, so we can diagnose typos.\r\n\r\n case $ac_dashdash$ac_option in\r\n --)\r\n ac_dashdash=yes ;;\r\n\r\n -bindir | --bindir | --bindi | --bind | --bin | --bi)\r\n ac_prev=bindir ;;\r\n -bindir=* | --bindir=* | --bindi=* | --bind=* | --bin=* | --bi=*)\r\n bindir=$ac_optarg ;;\r\n\r\n -build | --build | --buil | --bui | --bu)\r\n ac_prev=build_alias ;;\r\n -build=* | --build=* | --buil=* | --bui=* | --bu=*)\r\n build_alias=$ac_optarg ;;\r\n\r\n -cache-file | --cache-file | --cache-fil | --cache-fi \\\r\n | --cache-f | --cache- | --cache | --cach | --cac | --ca | --c)\r\n ac_prev=cache_file ;;\r\n -cache-file=* | --cache-file=* | --cache-fil=* | --cache-fi=* \\\r\n | --cache-f=* | --cache-=* | --cache=* | --cach=* | --cac=* | --ca=* | --c=*)\r\n cache_file=$ac_optarg ;;\r\n\r\n --config-cache | -C)\r\n cache_file=config.cache ;;\r\n\r\n -datadir | --datadir | --datadi | --datad)\r\n ac_prev=datadir ;;\r\n -datadir=* | --datadir=* | --datadi=* | --datad=*)\r\n datadir=$ac_optarg ;;\r\n\r\n -datarootdir | --datarootdir | --datarootdi | --datarootd | --dataroot \\\r\n | --dataroo | --dataro | --datar)\r\n ac_prev=datarootdir ;;\r\n -datarootdir=* | --datarootdir=* | --datarootdi=* | --datarootd=* \\\r\n | --dataroot=* | --dataroo=* | --dataro=* | --datar=*)\r\n datarootdir=$ac_optarg ;;\r\n\r\n -disable-* | --disable-*)\r\n ac_useropt=`expr \"x$ac_option\" : 'x-*disable-\\(.*\\)'`\r\n # Reject names that are not valid shell variable names.\r\n expr \"x$ac_useropt\" : \".*[^-+._$as_cr_alnum]\" >/dev/null &&\r\n as_fn_error $? 
\"invalid feature name: $ac_useropt\"\r\n ac_useropt_orig=$ac_useropt\r\n ac_useropt=`$as_echo \"$ac_useropt\" | sed 's/[-+.]/_/g'`\r\n case $ac_user_opts in\r\n *\"\r\n\"enable_$ac_useropt\"\r\n\"*) ;;\r\n *) ac_unrecognized_opts=\"$ac_unrecognized_opts$ac_unrecognized_sep--disable-$ac_useropt_orig\"\r\n\t ac_unrecognized_sep=', ';;\r\n esac\r\n eval enable_$ac_useropt=no ;;\r\n\r\n -docdir | --docdir | --docdi | --doc | --do)\r\n ac_prev=docdir ;;\r\n -docdir=* | --docdir=* | --docdi=* | --doc=* | --do=*)\r\n docdir=$ac_optarg ;;\r\n\r\n -dvidir | --dvidir | --dvidi | --dvid | --dvi | --dv)\r\n ac_prev=dvidir ;;\r\n -dvidir=* | --dvidir=* | --dvidi=* | --dvid=* | --dvi=* | --dv=*)\r\n dvidir=$ac_optarg ;;\r\n\r\n -enable-* | --enable-*)\r\n ac_useropt=`expr \"x$ac_option\" : 'x-*enable-\\([^=]*\\)'`\r\n # Reject names that are not valid shell variable names.\r\n expr \"x$ac_useropt\" : \".*[^-+._$as_cr_alnum]\" >/dev/null &&\r\n as_fn_error $? \"invalid feature name: $ac_useropt\"\r\n ac_useropt_orig=$ac_useropt\r\n ac_useropt=`$as_echo \"$ac_useropt\" | sed 's/[-+.]/_/g'`\r\n case $ac_user_opts in\r\n *\"\r\n\"enable_$ac_useropt\"\r\n\"*) ;;\r\n *) ac_unrecognized_opts=\"$ac_unrecognized_opts$ac_unrecognized_sep--enable-$ac_useropt_orig\"\r\n\t ac_unrecognized_sep=', ';;\r\n esac\r\n eval enable_$ac_useropt=\\$ac_optarg ;;\r\n\r\n -exec-prefix | --exec_prefix | --exec-prefix | --exec-prefi \\\r\n | --exec-pref | --exec-pre | --exec-pr | --exec-p | --exec- \\\r\n | --exec | --exe | --ex)\r\n ac_prev=exec_prefix ;;\r\n -exec-prefix=* | --exec_prefix=* | --exec-prefix=* | --exec-prefi=* \\\r\n | --exec-pref=* | --exec-pre=* | --exec-pr=* | --exec-p=* | --exec-=* \\\r\n | --exec=* | --exe=* | --ex=*)\r\n exec_prefix=$ac_optarg ;;\r\n\r\n -gas | --gas | --ga | --g)\r\n # Obsolete; use --with-gas.\r\n with_gas=yes ;;\r\n\r\n -help | --help | --hel | --he | -h)\r\n ac_init_help=long ;;\r\n -help=r* | --help=r* | --hel=r* | --he=r* | -hr*)\r\n ac_init_help=recursive ;;\r\n -help=s* | --help=s* | --hel=s* | --he=s* | -hs*)\r\n ac_init_help=short ;;\r\n\r\n -host | --host | --hos | --ho)\r\n ac_prev=host_alias ;;\r\n -host=* | --host=* | --hos=* | --ho=*)\r\n host_alias=$ac_optarg ;;\r\n\r\n -htmldir | --htmldir | --htmldi | --htmld | --html | --htm | --ht)\r\n ac_prev=htmldir ;;\r\n -htmldir=* | --htmldir=* | --htmldi=* | --htmld=* | --html=* | --htm=* \\\r\n | --ht=*)\r\n htmldir=$ac_optarg ;;\r\n\r\n -includedir | --includedir | --includedi | --included | --include \\\r\n | --includ | --inclu | --incl | --inc)\r\n ac_prev=includedir ;;\r\n -includedir=* | --includedir=* | --includedi=* | --included=* | --include=* \\\r\n | --includ=* | --inclu=* | --incl=* | --inc=*)\r\n includedir=$ac_optarg ;;\r\n\r\n -infodir | --infodir | --infodi | --infod | --info | --inf)\r\n ac_prev=infodir ;;\r\n -infodir=* | --infodir=* | --infodi=* | --infod=* | --info=* | --inf=*)\r\n infodir=$ac_optarg ;;\r\n\r\n -libdir | --libdir | --libdi | --libd)\r\n ac_prev=libdir ;;\r\n -libdir=* | --libdir=* | --libdi=* | --libd=*)\r\n libdir=$ac_optarg ;;\r\n\r\n -libexecdir | --libexecdir | --libexecdi | --libexecd | --libexec \\\r\n | --libexe | --libex | --libe)\r\n ac_prev=libexecdir ;;\r\n -libexecdir=* | --libexecdir=* | --libexecdi=* | --libexecd=* | --libexec=* \\\r\n | --libexe=* | --libex=* | --libe=*)\r\n libexecdir=$ac_optarg ;;\r\n\r\n -localedir | --localedir | --localedi | --localed | --locale)\r\n ac_prev=localedir ;;\r\n -localedir=* | --localedir=* | --localedi=* | --localed=* | --locale=*)\r\n 
localedir=$ac_optarg ;;\r\n\r\n -localstatedir | --localstatedir | --localstatedi | --localstated \\\r\n | --localstate | --localstat | --localsta | --localst | --locals)\r\n ac_prev=localstatedir ;;\r\n -localstatedir=* | --localstatedir=* | --localstatedi=* | --localstated=* \\\r\n | --localstate=* | --localstat=* | --localsta=* | --localst=* | --locals=*)\r\n localstatedir=$ac_optarg ;;\r\n\r\n -mandir | --mandir | --mandi | --mand | --man | --ma | --m)\r\n ac_prev=mandir ;;\r\n -mandir=* | --mandir=* | --mandi=* | --mand=* | --man=* | --ma=* | --m=*)\r\n mandir=$ac_optarg ;;\r\n\r\n -nfp | --nfp | --nf)\r\n # Obsolete; use --without-fp.\r\n with_fp=no ;;\r\n\r\n -no-create | --no-create | --no-creat | --no-crea | --no-cre \\\r\n | --no-cr | --no-c | -n)\r\n no_create=yes ;;\r\n\r\n -no-recursion | --no-recursion | --no-recursio | --no-recursi \\\r\n | --no-recurs | --no-recur | --no-recu | --no-rec | --no-re | --no-r)\r\n no_recursion=yes ;;\r\n\r\n -oldincludedir | --oldincludedir | --oldincludedi | --oldincluded \\\r\n | --oldinclude | --oldinclud | --oldinclu | --oldincl | --oldinc \\\r\n | --oldin | --oldi | --old | --ol | --o)\r\n ac_prev=oldincludedir ;;\r\n -oldincludedir=* | --oldincludedir=* | --oldincludedi=* | --oldincluded=* \\\r\n | --oldinclude=* | --oldinclud=* | --oldinclu=* | --oldincl=* | --oldinc=* \\\r\n | --oldin=* | --oldi=* | --old=* | --ol=* | --o=*)\r\n oldincludedir=$ac_optarg ;;\r\n\r\n -prefix | --prefix | --prefi | --pref | --pre | --pr | --p)\r\n ac_prev=prefix ;;\r\n -prefix=* | --prefix=* | --prefi=* | --pref=* | --pre=* | --pr=* | --p=*)\r\n prefix=$ac_optarg ;;\r\n\r\n -program-prefix | --program-prefix | --program-prefi | --program-pref \\\r\n | --program-pre | --program-pr | --program-p)\r\n ac_prev=program_prefix ;;\r\n -program-prefix=* | --program-prefix=* | --program-prefi=* \\\r\n | --program-pref=* | --program-pre=* | --program-pr=* | --program-p=*)\r\n program_prefix=$ac_optarg ;;\r\n\r\n -program-suffix | --program-suffix | --program-suffi | --program-suff \\\r\n | --program-suf | --program-su | --program-s)\r\n ac_prev=program_suffix ;;\r\n -program-suffix=* | --program-suffix=* | --program-suffi=* \\\r\n | --program-suff=* | --program-suf=* | --program-su=* | --program-s=*)\r\n program_suffix=$ac_optarg ;;\r\n\r\n -program-transform-name | --program-transform-name \\\r\n | --program-transform-nam | --program-transform-na \\\r\n | --program-transform-n | --program-transform- \\\r\n | --program-transform | --program-transfor \\\r\n | --program-transfo | --program-transf \\\r\n | --program-trans | --program-tran \\\r\n | --progr-tra | --program-tr | --program-t)\r\n ac_prev=program_transform_name ;;\r\n -program-transform-name=* | --program-transform-name=* \\\r\n | --program-transform-nam=* | --program-transform-na=* \\\r\n | --program-transform-n=* | --program-transform-=* \\\r\n | --program-transform=* | --program-transfor=* \\\r\n | --program-transfo=* | --program-transf=* \\\r\n | --program-trans=* | --program-tran=* \\\r\n | --progr-tra=* | --program-tr=* | --program-t=*)\r\n program_transform_name=$ac_optarg ;;\r\n\r\n -pdfdir | --pdfdir | --pdfdi | --pdfd | --pdf | --pd)\r\n ac_prev=pdfdir ;;\r\n -pdfdir=* | --pdfdir=* | --pdfdi=* | --pdfd=* | --pdf=* | --pd=*)\r\n pdfdir=$ac_optarg ;;\r\n\r\n -psdir | --psdir | --psdi | --psd | --ps)\r\n ac_prev=psdir ;;\r\n -psdir=* | --psdir=* | --psdi=* | --psd=* | --ps=*)\r\n psdir=$ac_optarg ;;\r\n\r\n -q | -quiet | --quiet | --quie | --qui | --qu | --q \\\r\n | -silent | --silent | --silen | 
--sile | --sil)\r\n silent=yes ;;\r\n\r\n -sbindir | --sbindir | --sbindi | --sbind | --sbin | --sbi | --sb)\r\n ac_prev=sbindir ;;\r\n -sbindir=* | --sbindir=* | --sbindi=* | --sbind=* | --sbin=* \\\r\n | --sbi=* | --sb=*)\r\n sbindir=$ac_optarg ;;\r\n\r\n -sharedstatedir | --sharedstatedir | --sharedstatedi \\\r\n | --sharedstated | --sharedstate | --sharedstat | --sharedsta \\\r\n | --sharedst | --shareds | --shared | --share | --shar \\\r\n | --sha | --sh)\r\n ac_prev=sharedstatedir ;;\r\n -sharedstatedir=* | --sharedstatedir=* | --sharedstatedi=* \\\r\n | --sharedstated=* | --sharedstate=* | --sharedstat=* | --sharedsta=* \\\r\n | --sharedst=* | --shareds=* | --shared=* | --share=* | --shar=* \\\r\n | --sha=* | --sh=*)\r\n sharedstatedir=$ac_optarg ;;\r\n\r\n -site | --site | --sit)\r\n ac_prev=site ;;\r\n -site=* | --site=* | --sit=*)\r\n site=$ac_optarg ;;\r\n\r\n -srcdir | --srcdir | --srcdi | --srcd | --src | --sr)\r\n ac_prev=srcdir ;;\r\n -srcdir=* | --srcdir=* | --srcdi=* | --srcd=* | --src=* | --sr=*)\r\n srcdir=$ac_optarg ;;\r\n\r\n -sysconfdir | --sysconfdir | --sysconfdi | --sysconfd | --sysconf \\\r\n | --syscon | --sysco | --sysc | --sys | --sy)\r\n ac_prev=sysconfdir ;;\r\n -sysconfdir=* | --sysconfdir=* | --sysconfdi=* | --sysconfd=* | --sysconf=* \\\r\n | --syscon=* | --sysco=* | --sysc=* | --sys=* | --sy=*)\r\n sysconfdir=$ac_optarg ;;\r\n\r\n -target | --target | --targe | --targ | --tar | --ta | --t)\r\n ac_prev=target_alias ;;\r\n -target=* | --target=* | --targe=* | --targ=* | --tar=* | --ta=* | --t=*)\r\n target_alias=$ac_optarg ;;\r\n\r\n -v | -verbose | --verbose | --verbos | --verbo | --verb)\r\n verbose=yes ;;\r\n\r\n -version | --version | --versio | --versi | --vers | -V)\r\n ac_init_version=: ;;\r\n\r\n -with-* | --with-*)\r\n ac_useropt=`expr \"x$ac_option\" : 'x-*with-\\([^=]*\\)'`\r\n # Reject names that are not valid shell variable names.\r\n expr \"x$ac_useropt\" : \".*[^-+._$as_cr_alnum]\" >/dev/null &&\r\n as_fn_error $? \"invalid package name: $ac_useropt\"\r\n ac_useropt_orig=$ac_useropt\r\n ac_useropt=`$as_echo \"$ac_useropt\" | sed 's/[-+.]/_/g'`\r\n case $ac_user_opts in\r\n *\"\r\n\"with_$ac_useropt\"\r\n\"*) ;;\r\n *) ac_unrecognized_opts=\"$ac_unrecognized_opts$ac_unrecognized_sep--with-$ac_useropt_orig\"\r\n\t ac_unrecognized_sep=', ';;\r\n esac\r\n eval with_$ac_useropt=\\$ac_optarg ;;\r\n\r\n -without-* | --without-*)\r\n ac_useropt=`expr \"x$ac_option\" : 'x-*without-\\(.*\\)'`\r\n # Reject names that are not valid shell variable names.\r\n expr \"x$ac_useropt\" : \".*[^-+._$as_cr_alnum]\" >/dev/null &&\r\n as_fn_error $? 
\"invalid package name: $ac_useropt\"\r\n ac_useropt_orig=$ac_useropt\r\n ac_useropt=`$as_echo \"$ac_useropt\" | sed 's/[-+.]/_/g'`\r\n case $ac_user_opts in\r\n *\"\r\n\"with_$ac_useropt\"\r\n\"*) ;;\r\n *) ac_unrecognized_opts=\"$ac_unrecognized_opts$ac_unrecognized_sep--without-$ac_useropt_orig\"\r\n\t ac_unrecognized_sep=', ';;\r\n esac\r\n eval with_$ac_useropt=no ;;\r\n\r\n --x)\r\n # Obsolete; use --with-x.\r\n with_x=yes ;;\r\n\r\n -x-includes | --x-includes | --x-include | --x-includ | --x-inclu \\\r\n | --x-incl | --x-inc | --x-in | --x-i)\r\n ac_prev=x_includes ;;\r\n -x-includes=* | --x-includes=* | --x-include=* | --x-includ=* | --x-inclu=* \\\r\n | --x-incl=* | --x-inc=* | --x-in=* | --x-i=*)\r\n x_includes=$ac_optarg ;;\r\n\r\n -x-libraries | --x-libraries | --x-librarie | --x-librari \\\r\n | --x-librar | --x-libra | --x-libr | --x-lib | --x-li | --x-l)\r\n ac_prev=x_libraries ;;\r\n -x-libraries=* | --x-libraries=* | --x-librarie=* | --x-librari=* \\\r\n | --x-librar=* | --x-libra=* | --x-libr=* | --x-lib=* | --x-li=* | --x-l=*)\r\n x_libraries=$ac_optarg ;;\r\n\r\n -*) as_fn_error $? \"unrecognized option: \\`$ac_option'\r\nTry \\`$0 --help' for more information\"\r\n ;;\r\n\r\n *=*)\r\n ac_envvar=`expr \"x$ac_option\" : 'x\\([^=]*\\)='`\r\n # Reject names that are not valid shell variable names.\r\n case $ac_envvar in #(\r\n '' | [0-9]* | *[!_$as_cr_alnum]* )\r\n as_fn_error $? \"invalid variable name: \\`$ac_envvar'\" ;;\r\n esac\r\n eval $ac_envvar=\\$ac_optarg\r\n export $ac_envvar ;;\r\n\r\n *)\r\n # FIXME: should be removed in autoconf 3.0.\r\n $as_echo \"$as_me: WARNING: you should use --build, --host, --target\" >&2\r\n expr \"x$ac_option\" : \".*[^-._$as_cr_alnum]\" >/dev/null &&\r\n $as_echo \"$as_me: WARNING: invalid host type: $ac_option\" >&2\r\n : \"${build_alias=$ac_option} ${host_alias=$ac_option} ${target_alias=$ac_option}\"\r\n ;;\r\n\r\n esac\r\ndone\r\n\r\nif test -n \"$ac_prev\"; then\r\n ac_option=--`echo $ac_prev | sed 's/_/-/g'`\r\n as_fn_error $? \"missing argument to $ac_option\"\r\nfi\r\n\r\nif test -n \"$ac_unrecognized_opts\"; then\r\n case $enable_option_checking in\r\n no) ;;\r\n fatal) as_fn_error $? \"unrecognized options: $ac_unrecognized_opts\" ;;\r\n *) $as_echo \"$as_me: WARNING: unrecognized options: $ac_unrecognized_opts\" >&2 ;;\r\n esac\r\nfi\r\n\r\n# Check all directory arguments for consistency.\r\nfor ac_var in\texec_prefix prefix bindir sbindir libexecdir datarootdir \\\r\n\t\tdatadir sysconfdir sharedstatedir localstatedir includedir \\\r\n\t\toldincludedir docdir infodir htmldir dvidir pdfdir psdir \\\r\n\t\tlibdir localedir mandir\r\ndo\r\n eval ac_val=\\$$ac_var\r\n # Remove trailing slashes.\r\n case $ac_val in\r\n */ )\r\n ac_val=`expr \"X$ac_val\" : 'X\\(.*[^/]\\)' \\| \"X$ac_val\" : 'X\\(.*\\)'`\r\n eval $ac_var=\\$ac_val;;\r\n esac\r\n # Be sure to have absolute directory names.\r\n case $ac_val in\r\n [\\\\/$]* | ?:[\\\\/]* ) continue;;\r\n NONE | '' ) case $ac_var in *prefix ) continue;; esac;;\r\n esac\r\n as_fn_error $? 
\"expected an absolute directory name for --$ac_var: $ac_val\"\r\ndone\r\n\r\n# There might be people who depend on the old broken behavior: `$host'\r\n# used to hold the argument of --host etc.\r\n# FIXME: To remove some day.\r\nbuild=$build_alias\r\nhost=$host_alias\r\ntarget=$target_alias\r\n\r\n# FIXME: To remove some day.\r\nif test \"x$host_alias\" != x; then\r\n if test \"x$build_alias\" = x; then\r\n cross_compiling=maybe\r\n elif test \"x$build_alias\" != \"x$host_alias\"; then\r\n cross_compiling=yes\r\n fi\r\nfi\r\n\r\nac_tool_prefix=\r\ntest -n \"$host_alias\" && ac_tool_prefix=$host_alias-\r\n\r\ntest \"$silent\" = yes && exec 6>/dev/null\r\n\r\n\r\nac_pwd=`pwd` && test -n \"$ac_pwd\" &&\r\nac_ls_di=`ls -di .` &&\r\nac_pwd_ls_di=`cd \"$ac_pwd\" && ls -di .` ||\r\n as_fn_error $? \"working directory cannot be determined\"\r\ntest \"X$ac_ls_di\" = \"X$ac_pwd_ls_di\" ||\r\n as_fn_error $? \"pwd does not report name of working directory\"\r\n\r\n\r\n# Find the source files, if location was not specified.\r\nif test -z \"$srcdir\"; then\r\n ac_srcdir_defaulted=yes\r\n # Try the directory containing this script, then the parent directory.\r\n ac_confdir=`$as_dirname -- \"$as_myself\" ||\r\n$as_expr X\"$as_myself\" : 'X\\(.*[^/]\\)//*[^/][^/]*/*$' \\| \\\r\n\t X\"$as_myself\" : 'X\\(//\\)[^/]' \\| \\\r\n\t X\"$as_myself\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$as_myself\" : 'X\\(/\\)' \\| . 2>/dev/null ||\r\n$as_echo X\"$as_myself\" |\r\n sed '/^X\\(.*[^/]\\)\\/\\/*[^/][^/]*\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)[^/].*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n srcdir=$ac_confdir\r\n if test ! -r \"$srcdir/$ac_unique_file\"; then\r\n srcdir=..\r\n fi\r\nelse\r\n ac_srcdir_defaulted=no\r\nfi\r\nif test ! -r \"$srcdir/$ac_unique_file\"; then\r\n test \"$ac_srcdir_defaulted\" = yes && srcdir=\"$ac_confdir or ..\"\r\n as_fn_error $? \"cannot find sources ($ac_unique_file) in $srcdir\"\r\nfi\r\nac_msg=\"sources are in $srcdir, but \\`cd $srcdir' does not work\"\r\nac_abs_confdir=`(\r\n\tcd \"$srcdir\" && test -r \"./$ac_unique_file\" || as_fn_error $? \"$ac_msg\"\r\n\tpwd)`\r\n# When building in place, set srcdir=.\r\nif test \"$ac_abs_confdir\" = \"$ac_pwd\"; then\r\n srcdir=.\r\nfi\r\n# Remove unnecessary trailing slashes from srcdir.\r\n# Double slashes in file names in object file debugging info\r\n# mess up M-x gdb in Emacs.\r\ncase $srcdir in\r\n*/) srcdir=`expr \"X$srcdir\" : 'X\\(.*[^/]\\)' \\| \"X$srcdir\" : 'X\\(.*\\)'`;;\r\nesac\r\nfor ac_var in $ac_precious_vars; do\r\n eval ac_env_${ac_var}_set=\\${${ac_var}+set}\r\n eval ac_env_${ac_var}_value=\\$${ac_var}\r\n eval ac_cv_env_${ac_var}_set=\\${${ac_var}+set}\r\n eval ac_cv_env_${ac_var}_value=\\$${ac_var}\r\ndone\r\n\r\n#\r\n# Report the --help message.\r\n#\r\nif test \"$ac_init_help\" = \"long\"; then\r\n # Omit some internal or obsolete options to make the list less imposing.\r\n # This message is too long to be a string in the A/UX 3.1 sh.\r\n cat <<_ACEOF\r\n\\`configure' configures Snes9x 1.60 to adapt to many kinds of systems.\r\n\r\nUsage: $0 [OPTION]... [VAR=VALUE]...\r\n\r\nTo assign environment variables (e.g., CC, CFLAGS...), specify them as\r\nVAR=VALUE. 
See below for descriptions of some of the useful variables.\r\n\r\nDefaults for the options are specified in brackets.\r\n\r\nConfiguration:\r\n -h, --help display this help and exit\r\n --help=short display options specific to this package\r\n --help=recursive display the short help of all the included packages\r\n -V, --version display version information and exit\r\n -q, --quiet, --silent do not print \\`checking ...' messages\r\n --cache-file=FILE cache test results in FILE [disabled]\r\n -C, --config-cache alias for \\`--cache-file=config.cache'\r\n -n, --no-create do not create output files\r\n --srcdir=DIR find the sources in DIR [configure dir or \\`..']\r\n\r\nInstallation directories:\r\n --prefix=PREFIX install architecture-independent files in PREFIX\r\n [$ac_default_prefix]\r\n --exec-prefix=EPREFIX install architecture-dependent files in EPREFIX\r\n [PREFIX]\r\n\r\nBy default, \\`make install' will install all the files in\r\n\\`$ac_default_prefix/bin', \\`$ac_default_prefix/lib' etc. You can specify\r\nan installation prefix other than \\`$ac_default_prefix' using \\`--prefix',\r\nfor instance \\`--prefix=\\$HOME'.\r\n\r\nFor better control, use the options below.\r\n\r\nFine tuning of the installation directories:\r\n --bindir=DIR user executables [EPREFIX/bin]\r\n --sbindir=DIR system admin executables [EPREFIX/sbin]\r\n --libexecdir=DIR program executables [EPREFIX/libexec]\r\n --sysconfdir=DIR read-only single-machine data [PREFIX/etc]\r\n --sharedstatedir=DIR modifiable architecture-independent data [PREFIX/com]\r\n --localstatedir=DIR modifiable single-machine data [PREFIX/var]\r\n --libdir=DIR object code libraries [EPREFIX/lib]\r\n --includedir=DIR C header files [PREFIX/include]\r\n --oldincludedir=DIR C header files for non-gcc [/usr/include]\r\n --datarootdir=DIR read-only arch.-independent data root [PREFIX/share]\r\n --datadir=DIR read-only architecture-independent data [DATAROOTDIR]\r\n --infodir=DIR info documentation [DATAROOTDIR/info]\r\n --localedir=DIR locale-dependent data [DATAROOTDIR/locale]\r\n --mandir=DIR man documentation [DATAROOTDIR/man]\r\n --docdir=DIR documentation root [DATAROOTDIR/doc/snes9x]\r\n --htmldir=DIR html documentation [DOCDIR]\r\n --dvidir=DIR dvi documentation [DOCDIR]\r\n --pdfdir=DIR pdf documentation [DOCDIR]\r\n --psdir=DIR ps documentation [DOCDIR]\r\n_ACEOF\r\n\r\n cat <<\\_ACEOF\r\n\r\nX features:\r\n --x-includes=DIR X include files are in DIR\r\n --x-libraries=DIR X library files are in DIR\r\n\r\nSystem types:\r\n --build=BUILD configure for building on BUILD [guessed]\r\n --host=HOST cross-compile to build programs to run on HOST [BUILD]\r\n --target=TARGET configure for building compilers for TARGET [HOST]\r\n_ACEOF\r\nfi\r\n\r\nif test -n \"$ac_init_help\"; then\r\n case $ac_init_help in\r\n short | recursive ) echo \"Configuration of Snes9x 1.60:\";;\r\n esac\r\n cat <<\\_ACEOF\r\n\r\nOptional Features:\r\n --disable-option-checking ignore unrecognized --enable/--with options\r\n --disable-FEATURE do not include FEATURE (same as --enable-FEATURE=no)\r\n --enable-FEATURE[=ARG] include FEATURE [ARG=yes]\r\n --enable-debug leave debug information in the final binary\r\n (default: no)\r\n --enable-mtune use the specified value for the -mtune/-mcpu flag\r\n (default: no)\r\n --enable-sse41 enable SSE4.1 if available (default: no)\r\n --enable-avx2 enable AVX2 if available (default: no)\r\n --enable-neon enable NEON if available (default: no)\r\n --enable-gamepad enable gamepad support if available (default: yes)\r\n 
--enable-debugger enable debugger (default: no)\r\n --enable-netplay enable netplay support (default: no)\r\n --enable-gzip enable GZIP support through zlib (default: yes)\r\n --enable-zip enable ZIP support through zlib (default: yes)\r\n --enable-jma enable JMA support (default: yes)\r\n --enable-screenshot enable screenshot support through libpng (default:\r\n yes)\r\n --enable-xvideo enable Xvideo if available (default: yes)\r\n --enable-xinerama enable Xinerama if available (default: yes)\r\n --enable-sound enable sound if available (default: yes)\r\n\r\nOptional Packages:\r\n --with-PACKAGE[=ARG] use PACKAGE [ARG=yes]\r\n --without-PACKAGE do not use PACKAGE (same as --with-PACKAGE=no)\r\n --with-system-zip Use system zip (default: check)\r\n --with-x use the X Window System\r\n\r\nSome influential environment variables:\r\n CC C compiler command\r\n CFLAGS C compiler flags\r\n LDFLAGS linker flags, e.g. -L<lib dir> if you have libraries in a\r\n nonstandard directory <lib dir>\r\n LIBS libraries to pass to the linker, e.g. -l<library>\r\n CPPFLAGS (Objective) C/C++ preprocessor flags, e.g. -I<include dir> if\r\n you have headers in a nonstandard directory <include dir>\r\n CXX C++ compiler command\r\n CXXFLAGS C++ compiler flags\r\n CXXCPP C++ preprocessor\r\n PKG_CONFIG path to pkg-config utility\r\n PKG_CONFIG_PATH\r\n directories to add to pkg-config's search path\r\n PKG_CONFIG_LIBDIR\r\n path overriding pkg-config's built-in search path\r\n SYSTEM_ZIP_CFLAGS\r\n C compiler flags for SYSTEM_ZIP, overriding pkg-config\r\n SYSTEM_ZIP_LIBS\r\n linker flags for SYSTEM_ZIP, overriding pkg-config\r\n XMKMF Path to xmkmf, Makefile generator for X Window System\r\n\r\nUse these variables to override the choices made by `configure' or to help\r\nit to find libraries and programs with nonstandard names/locations.\r\n\r\nReport bugs to the package provider.\r\n_ACEOF\r\nac_status=$?\r\nfi\r\n\r\nif test \"$ac_init_help\" = \"recursive\"; then\r\n # If there are subdirs, report their specific --help.\r\n for ac_dir in : $ac_subdirs_all; do test \"x$ac_dir\" = x: && continue\r\n test -d \"$ac_dir\" ||\r\n { cd \"$srcdir\" && ac_pwd=`pwd` && srcdir=. && test -d \"$ac_dir\"; } ||\r\n continue\r\n ac_builddir=.\r\n\r\ncase \"$ac_dir\" in\r\n.) ac_dir_suffix= ac_top_builddir_sub=. ac_top_build_prefix= ;;\r\n*)\r\n ac_dir_suffix=/`$as_echo \"$ac_dir\" | sed 's|^\\.[\\\\/]||'`\r\n # A \"..\" for each directory in $ac_dir_suffix.\r\n ac_top_builddir_sub=`$as_echo \"$ac_dir_suffix\" | sed 's|/[^\\\\/]*|/..|g;s|/||'`\r\n case $ac_top_builddir_sub in\r\n \"\") ac_top_builddir_sub=. ac_top_build_prefix= ;;\r\n *) ac_top_build_prefix=$ac_top_builddir_sub/ ;;\r\n esac ;;\r\nesac\r\nac_abs_top_builddir=$ac_pwd\r\nac_abs_builddir=$ac_pwd$ac_dir_suffix\r\n# for backward compatibility:\r\nac_top_builddir=$ac_top_build_prefix\r\n\r\ncase $srcdir in\r\n .) 
# We are building in place.\r\n ac_srcdir=.\r\n ac_top_srcdir=$ac_top_builddir_sub\r\n ac_abs_top_srcdir=$ac_pwd ;;\r\n [\\\\/]* | ?:[\\\\/]* ) # Absolute name.\r\n ac_srcdir=$srcdir$ac_dir_suffix;\r\n ac_top_srcdir=$srcdir\r\n ac_abs_top_srcdir=$srcdir ;;\r\n *) # Relative name.\r\n ac_srcdir=$ac_top_build_prefix$srcdir$ac_dir_suffix\r\n ac_top_srcdir=$ac_top_build_prefix$srcdir\r\n ac_abs_top_srcdir=$ac_pwd/$srcdir ;;\r\nesac\r\nac_abs_srcdir=$ac_abs_top_srcdir$ac_dir_suffix\r\n\r\n cd \"$ac_dir\" || { ac_status=$?; continue; }\r\n # Check for guested configure.\r\n if test -f \"$ac_srcdir/configure.gnu\"; then\r\n echo &&\r\n $SHELL \"$ac_srcdir/configure.gnu\" --help=recursive\r\n elif test -f \"$ac_srcdir/configure\"; then\r\n echo &&\r\n $SHELL \"$ac_srcdir/configure\" --help=recursive\r\n else\r\n $as_echo \"$as_me: WARNING: no configuration information is in $ac_dir\" >&2\r\n fi || ac_status=$?\r\n cd \"$ac_pwd\" || { ac_status=$?; break; }\r\n done\r\nfi\r\n\r\ntest -n \"$ac_init_help\" && exit $ac_status\r\nif $ac_init_version; then\r\n cat <<\\_ACEOF\r\nSnes9x configure 1.60\r\ngenerated by GNU Autoconf 2.69\r\n\r\nCopyright (C) 2012 Free Software Foundation, Inc.\r\nThis configure script is free software; the Free Software Foundation\r\ngives unlimited permission to copy, distribute and modify it.\r\n_ACEOF\r\n exit\r\nfi\r\n\r\n## ------------------------ ##\r\n## Autoconf initialization. ##\r\n## ------------------------ ##\r\n\r\n# ac_fn_c_try_compile LINENO\r\n# --------------------------\r\n# Try to compile conftest.$ac_ext, and return whether this succeeded.\r\nac_fn_c_try_compile ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n rm -f conftest.$ac_objext\r\n if { { ac_try=\"$ac_compile\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_compile\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n grep -v '^ *+' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n mv -f conftest.er1 conftest.err\r\n fi\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; } && {\r\n\t test -z \"$ac_c_werror_flag\" ||\r\n\t test ! -s conftest.err\r\n } && test -s conftest.$ac_objext; then :\r\n ac_retval=0\r\nelse\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n\tac_retval=1\r\nfi\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n as_fn_set_status $ac_retval\r\n\r\n} # ac_fn_c_try_compile\r\n\r\n# ac_fn_cxx_try_compile LINENO\r\n# ----------------------------\r\n# Try to compile conftest.$ac_ext, and return whether this succeeded.\r\nac_fn_cxx_try_compile ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n rm -f conftest.$ac_objext\r\n if { { ac_try=\"$ac_compile\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_compile\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n grep -v '^ *+' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n mv -f conftest.er1 conftest.err\r\n fi\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? 
= $ac_status\" >&5\r\n test $ac_status = 0; } && {\r\n\t test -z \"$ac_cxx_werror_flag\" ||\r\n\t test ! -s conftest.err\r\n } && test -s conftest.$ac_objext; then :\r\n ac_retval=0\r\nelse\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n\tac_retval=1\r\nfi\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n as_fn_set_status $ac_retval\r\n\r\n} # ac_fn_cxx_try_compile\r\n\r\n# ac_fn_cxx_try_run LINENO\r\n# ------------------------\r\n# Try to link conftest.$ac_ext, and return whether this succeeded. Assumes\r\n# that executables *can* be run.\r\nac_fn_cxx_try_run ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n if { { ac_try=\"$ac_link\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_link\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; } && { ac_try='./conftest$ac_exeext'\r\n { { case \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_try\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; }; then :\r\n ac_retval=0\r\nelse\r\n $as_echo \"$as_me: program exited with status $ac_status\" >&5\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n ac_retval=$ac_status\r\nfi\r\n rm -rf conftest.dSYM conftest_ipa8_conftest.oo\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n as_fn_set_status $ac_retval\r\n\r\n} # ac_fn_cxx_try_run\r\n\r\n# ac_fn_cxx_try_cpp LINENO\r\n# ------------------------\r\n# Try to preprocess conftest.$ac_ext, and return whether this succeeded.\r\nac_fn_cxx_try_cpp ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n if { { ac_try=\"$ac_cpp conftest.$ac_ext\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_cpp conftest.$ac_ext\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n grep -v '^ *+' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n mv -f conftest.er1 conftest.err\r\n fi\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; } > conftest.i && {\r\n\t test -z \"$ac_cxx_preproc_warn_flag$ac_cxx_werror_flag\" ||\r\n\t test ! 
-s conftest.err\r\n }; then :\r\n ac_retval=0\r\nelse\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n ac_retval=1\r\nfi\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n as_fn_set_status $ac_retval\r\n\r\n} # ac_fn_cxx_try_cpp\r\n\r\n# ac_fn_cxx_check_header_mongrel LINENO HEADER VAR INCLUDES\r\n# ---------------------------------------------------------\r\n# Tests whether HEADER exists, giving a warning if it cannot be compiled using\r\n# the include files in INCLUDES and setting the cache variable VAR\r\n# accordingly.\r\nac_fn_cxx_check_header_mongrel ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n if eval \\${$3+:} false; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $2\" >&5\r\n$as_echo_n \"checking for $2... \" >&6; }\r\nif eval \\${$3+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nfi\r\neval ac_res=\\$$3\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_res\" >&5\r\n$as_echo \"$ac_res\" >&6; }\r\nelse\r\n # Is the header compilable?\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking $2 usability\" >&5\r\n$as_echo_n \"checking $2 usability... \" >&6; }\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n$4\r\n#include <$2>\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n ac_header_compiler=yes\r\nelse\r\n ac_header_compiler=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_header_compiler\" >&5\r\n$as_echo \"$ac_header_compiler\" >&6; }\r\n\r\n# Is the header present?\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking $2 presence\" >&5\r\n$as_echo_n \"checking $2 presence... \" >&6; }\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <$2>\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n ac_header_preproc=yes\r\nelse\r\n ac_header_preproc=no\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_header_preproc\" >&5\r\n$as_echo \"$ac_header_preproc\" >&6; }\r\n\r\n# So? 
What about this header?\r\ncase $ac_header_compiler:$ac_header_preproc:$ac_cxx_preproc_warn_flag in #((\r\n yes:no: )\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: accepted by the compiler, rejected by the preprocessor!\" >&5\r\n$as_echo \"$as_me: WARNING: $2: accepted by the compiler, rejected by the preprocessor!\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: proceeding with the compiler's result\" >&5\r\n$as_echo \"$as_me: WARNING: $2: proceeding with the compiler's result\" >&2;}\r\n ;;\r\n no:yes:* )\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: present but cannot be compiled\" >&5\r\n$as_echo \"$as_me: WARNING: $2: present but cannot be compiled\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: check for missing prerequisite headers?\" >&5\r\n$as_echo \"$as_me: WARNING: $2: check for missing prerequisite headers?\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: see the Autoconf documentation\" >&5\r\n$as_echo \"$as_me: WARNING: $2: see the Autoconf documentation\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: section \\\"Present But Cannot Be Compiled\\\"\" >&5\r\n$as_echo \"$as_me: WARNING: $2: section \\\"Present But Cannot Be Compiled\\\"\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $2: proceeding with the compiler's result\" >&5\r\n$as_echo \"$as_me: WARNING: $2: proceeding with the compiler's result\" >&2;}\r\n ;;\r\nesac\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $2\" >&5\r\n$as_echo_n \"checking for $2... \" >&6; }\r\nif eval \\${$3+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n eval \"$3=\\$ac_header_compiler\"\r\nfi\r\neval ac_res=\\$$3\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_res\" >&5\r\n$as_echo \"$ac_res\" >&6; }\r\nfi\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n\r\n} # ac_fn_cxx_check_header_mongrel\r\n\r\n# ac_fn_cxx_check_header_compile LINENO HEADER VAR INCLUDES\r\n# ---------------------------------------------------------\r\n# Tests whether HEADER exists and can be compiled using the include files in\r\n# INCLUDES, setting the cache variable VAR accordingly.\r\nac_fn_cxx_check_header_compile ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $2\" >&5\r\n$as_echo_n \"checking for $2... \" >&6; }\r\nif eval \\${$3+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n$4\r\n#include <$2>\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n eval \"$3=yes\"\r\nelse\r\n eval \"$3=no\"\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nfi\r\neval ac_res=\\$$3\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_res\" >&5\r\n$as_echo \"$ac_res\" >&6; }\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n\r\n} # ac_fn_cxx_check_header_compile\r\n\r\n# ac_fn_cxx_try_link LINENO\r\n# -------------------------\r\n# Try to link conftest.$ac_ext, and return whether this succeeded.\r\nac_fn_cxx_try_link ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n rm -f conftest.$ac_objext conftest$ac_exeext\r\n if { { ac_try=\"$ac_link\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_link\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n grep -v '^ *+' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n mv -f conftest.er1 conftest.err\r\n fi\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; } && {\r\n\t test -z \"$ac_cxx_werror_flag\" ||\r\n\t test ! -s conftest.err\r\n } && test -s conftest$ac_exeext && {\r\n\t test \"$cross_compiling\" = yes ||\r\n\t test -x conftest$ac_exeext\r\n }; then :\r\n ac_retval=0\r\nelse\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n\tac_retval=1\r\nfi\r\n # Delete the IPA/IPO (Inter Procedural Analysis/Optimization) information\r\n # created by the PGI compiler (conftest_ipa8_conftest.oo), as it would\r\n # interfere with the next link command; also delete a directory that is\r\n # left behind by Apple's compiler. We do this before executing the actions.\r\n rm -rf conftest.dSYM conftest_ipa8_conftest.oo\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n as_fn_set_status $ac_retval\r\n\r\n} # ac_fn_cxx_try_link\r\n\r\n# ac_fn_cxx_check_func LINENO FUNC VAR\r\n# ------------------------------------\r\n# Tests whether FUNC exists, setting the cache variable VAR accordingly\r\nac_fn_cxx_check_func ()\r\n{\r\n as_lineno=${as_lineno-\"$1\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $2\" >&5\r\n$as_echo_n \"checking for $2... \" >&6; }\r\nif eval \\${$3+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n/* Define $2 to an innocuous variant, in case <limits.h> declares $2.\r\n For example, HP-UX 11i <limits.h> declares gettimeofday. */\r\n#define $2 innocuous_$2\r\n\r\n/* System header to define __stub macros and hopefully few prototypes,\r\n which can conflict with char $2 (); below.\r\n Prefer <limits.h> to <assert.h> if __STDC__ is defined, since\r\n <limits.h> exists even on freestanding compilers. */\r\n\r\n#ifdef __STDC__\r\n# include <limits.h>\r\n#else\r\n# include <assert.h>\r\n#endif\r\n\r\n#undef $2\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar $2 ();\r\n/* The GNU C library defines this for functions which it implements\r\n to always fail with ENOSYS. 
Some functions are actually named\r\n something starting with __ and the normal name is an alias. */\r\n#if defined __stub_$2 || defined __stub___$2\r\nchoke me\r\n#endif\r\n\r\nint\r\nmain ()\r\n{\r\nreturn $2 ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n eval \"$3=yes\"\r\nelse\r\n eval \"$3=no\"\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nfi\r\neval ac_res=\\$$3\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_res\" >&5\r\n$as_echo \"$ac_res\" >&6; }\r\n eval $as_lineno_stack; ${as_lineno_stack:+:} unset as_lineno\r\n\r\n} # ac_fn_cxx_check_func\r\ncat >config.log <<_ACEOF\r\nThis file contains any messages produced by compilers while\r\nrunning configure, to aid debugging if configure makes a mistake.\r\n\r\nIt was created by Snes9x $as_me 1.60, which was\r\ngenerated by GNU Autoconf 2.69. Invocation command line was\r\n\r\n $ $0 $@\r\n\r\n_ACEOF\r\nexec 5>>config.log\r\n{\r\ncat <<_ASUNAME\r\n## --------- ##\r\n## Platform. ##\r\n## --------- ##\r\n\r\nhostname = `(hostname || uname -n) 2>/dev/null | sed 1q`\r\nuname -m = `(uname -m) 2>/dev/null || echo unknown`\r\nuname -r = `(uname -r) 2>/dev/null || echo unknown`\r\nuname -s = `(uname -s) 2>/dev/null || echo unknown`\r\nuname -v = `(uname -v) 2>/dev/null || echo unknown`\r\n\r\n/usr/bin/uname -p = `(/usr/bin/uname -p) 2>/dev/null || echo unknown`\r\n/bin/uname -X = `(/bin/uname -X) 2>/dev/null || echo unknown`\r\n\r\n/bin/arch = `(/bin/arch) 2>/dev/null || echo unknown`\r\n/usr/bin/arch -k = `(/usr/bin/arch -k) 2>/dev/null || echo unknown`\r\n/usr/convex/getsysinfo = `(/usr/convex/getsysinfo) 2>/dev/null || echo unknown`\r\n/usr/bin/hostinfo = `(/usr/bin/hostinfo) 2>/dev/null || echo unknown`\r\n/bin/machine = `(/bin/machine) 2>/dev/null || echo unknown`\r\n/usr/bin/oslevel = `(/usr/bin/oslevel) 2>/dev/null || echo unknown`\r\n/bin/universe = `(/bin/universe) 2>/dev/null || echo unknown`\r\n\r\n_ASUNAME\r\n\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n $as_echo \"PATH: $as_dir\"\r\n done\r\nIFS=$as_save_IFS\r\n\r\n} >&5\r\n\r\ncat >&5 <<_ACEOF\r\n\r\n\r\n## ----------- ##\r\n## Core tests. 
##\r\n## ----------- ##\r\n\r\n_ACEOF\r\n\r\n\r\n# Keep a trace of the command line.\r\n# Strip out --no-create and --no-recursion so they do not pile up.\r\n# Strip out --silent because we don't want to record it for future runs.\r\n# Also quote any args containing shell meta-characters.\r\n# Make two passes to allow for proper duplicate-argument suppression.\r\nac_configure_args=\r\nac_configure_args0=\r\nac_configure_args1=\r\nac_must_keep_next=false\r\nfor ac_pass in 1 2\r\ndo\r\n for ac_arg\r\n do\r\n case $ac_arg in\r\n -no-create | --no-c* | -n | -no-recursion | --no-r*) continue ;;\r\n -q | -quiet | --quiet | --quie | --qui | --qu | --q \\\r\n | -silent | --silent | --silen | --sile | --sil)\r\n continue ;;\r\n *\\'*)\r\n ac_arg=`$as_echo \"$ac_arg\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"` ;;\r\n esac\r\n case $ac_pass in\r\n 1) as_fn_append ac_configure_args0 \" '$ac_arg'\" ;;\r\n 2)\r\n as_fn_append ac_configure_args1 \" '$ac_arg'\"\r\n if test $ac_must_keep_next = true; then\r\n\tac_must_keep_next=false # Got value, back to normal.\r\n else\r\n\tcase $ac_arg in\r\n\t *=* | --config-cache | -C | -disable-* | --disable-* \\\r\n\t | -enable-* | --enable-* | -gas | --g* | -nfp | --nf* \\\r\n\t | -q | -quiet | --q* | -silent | --sil* | -v | -verb* \\\r\n\t | -with-* | --with-* | -without-* | --without-* | --x)\r\n\t case \"$ac_configure_args0 \" in\r\n\t \"$ac_configure_args1\"*\" '$ac_arg' \"* ) continue ;;\r\n\t esac\r\n\t ;;\r\n\t -* ) ac_must_keep_next=true ;;\r\n\tesac\r\n fi\r\n as_fn_append ac_configure_args \" '$ac_arg'\"\r\n ;;\r\n esac\r\n done\r\ndone\r\n{ ac_configure_args0=; unset ac_configure_args0;}\r\n{ ac_configure_args1=; unset ac_configure_args1;}\r\n\r\n# When interrupted or exit'd, cleanup temporary files, and complete\r\n# config.log. We remove comments because anyway the quotes in there\r\n# would cause problems or look ugly.\r\n# WARNING: Use '\\'' to represent an apostrophe within the trap.\r\n# WARNING: Do not start the trap code with a newline, due to a FreeBSD 4.0 bug.\r\ntrap 'exit_status=$?\r\n # Save into config.log some information that might help in debugging.\r\n {\r\n echo\r\n\r\n $as_echo \"## ---------------- ##\r\n## Cache variables. ##\r\n## ---------------- ##\"\r\n echo\r\n # The following way of writing the cache mishandles newlines in values,\r\n(\r\n for ac_var in `(set) 2>&1 | sed -n '\\''s/^\\([a-zA-Z_][a-zA-Z0-9_]*\\)=.*/\\1/p'\\''`; do\r\n eval ac_val=\\$$ac_var\r\n case $ac_val in #(\r\n *${as_nl}*)\r\n case $ac_var in #(\r\n *_cv_*) { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: cache variable $ac_var contains a newline\" >&5\r\n$as_echo \"$as_me: WARNING: cache variable $ac_var contains a newline\" >&2;} ;;\r\n esac\r\n case $ac_var in #(\r\n _ | IFS | as_nl) ;; #(\r\n BASH_ARGV | BASH_SOURCE) eval $ac_var= ;; #(\r\n *) { eval $ac_var=; unset $ac_var;} ;;\r\n esac ;;\r\n esac\r\n done\r\n (set) 2>&1 |\r\n case $as_nl`(ac_space='\\'' '\\''; set) 2>&1` in #(\r\n *${as_nl}ac_space=\\ *)\r\n sed -n \\\r\n\t\"s/'\\''/'\\''\\\\\\\\'\\'''\\''/g;\r\n\t s/^\\\\([_$as_cr_alnum]*_cv_[_$as_cr_alnum]*\\\\)=\\\\(.*\\\\)/\\\\1='\\''\\\\2'\\''/p\"\r\n ;; #(\r\n *)\r\n sed -n \"/^[_$as_cr_alnum]*_cv_[_$as_cr_alnum]*=/p\"\r\n ;;\r\n esac |\r\n sort\r\n)\r\n echo\r\n\r\n $as_echo \"## ----------------- ##\r\n## Output variables. 
##\r\n## ----------------- ##\"\r\n echo\r\n for ac_var in $ac_subst_vars\r\n do\r\n eval ac_val=\\$$ac_var\r\n case $ac_val in\r\n *\\'\\''*) ac_val=`$as_echo \"$ac_val\" | sed \"s/'\\''/'\\''\\\\\\\\\\\\\\\\'\\'''\\''/g\"`;;\r\n esac\r\n $as_echo \"$ac_var='\\''$ac_val'\\''\"\r\n done | sort\r\n echo\r\n\r\n if test -n \"$ac_subst_files\"; then\r\n $as_echo \"## ------------------- ##\r\n## File substitutions. ##\r\n## ------------------- ##\"\r\n echo\r\n for ac_var in $ac_subst_files\r\n do\r\n\teval ac_val=\\$$ac_var\r\n\tcase $ac_val in\r\n\t*\\'\\''*) ac_val=`$as_echo \"$ac_val\" | sed \"s/'\\''/'\\''\\\\\\\\\\\\\\\\'\\'''\\''/g\"`;;\r\n\tesac\r\n\t$as_echo \"$ac_var='\\''$ac_val'\\''\"\r\n done | sort\r\n echo\r\n fi\r\n\r\n if test -s confdefs.h; then\r\n $as_echo \"## ----------- ##\r\n## confdefs.h. ##\r\n## ----------- ##\"\r\n echo\r\n cat confdefs.h\r\n echo\r\n fi\r\n test \"$ac_signal\" != 0 &&\r\n $as_echo \"$as_me: caught signal $ac_signal\"\r\n $as_echo \"$as_me: exit $exit_status\"\r\n } >&5\r\n rm -f core *.core core.conftest.* &&\r\n rm -f -r conftest* confdefs* conf$$* $ac_clean_files &&\r\n exit $exit_status\r\n' 0\r\nfor ac_signal in 1 2 13 15; do\r\n trap 'ac_signal='$ac_signal'; as_fn_exit 1' $ac_signal\r\ndone\r\nac_signal=0\r\n\r\n# confdefs.h avoids OS command line length limits that DEFS can exceed.\r\nrm -f -r conftest* confdefs.h\r\n\r\n$as_echo \"/* confdefs.h */\" > confdefs.h\r\n\r\n# Predefined preprocessor variables.\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_NAME \"$PACKAGE_NAME\"\r\n_ACEOF\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_TARNAME \"$PACKAGE_TARNAME\"\r\n_ACEOF\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_VERSION \"$PACKAGE_VERSION\"\r\n_ACEOF\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_STRING \"$PACKAGE_STRING\"\r\n_ACEOF\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_BUGREPORT \"$PACKAGE_BUGREPORT\"\r\n_ACEOF\r\n\r\ncat >>confdefs.h <<_ACEOF\r\n#define PACKAGE_URL \"$PACKAGE_URL\"\r\n_ACEOF\r\n\r\n\r\n# Let the site file select an alternate cache file if it wants to.\r\n# Prefer an explicitly selected file to automatically selected ones.\r\nac_site_file1=NONE\r\nac_site_file2=NONE\r\nif test -n \"$CONFIG_SITE\"; then\r\n # We do not want a PATH search for config.site.\r\n case $CONFIG_SITE in #((\r\n -*) ac_site_file1=./$CONFIG_SITE;;\r\n */*) ac_site_file1=$CONFIG_SITE;;\r\n *) ac_site_file1=./$CONFIG_SITE;;\r\n esac\r\nelif test \"x$prefix\" != xNONE; then\r\n ac_site_file1=$prefix/share/config.site\r\n ac_site_file2=$prefix/etc/config.site\r\nelse\r\n ac_site_file1=$ac_default_prefix/share/config.site\r\n ac_site_file2=$ac_default_prefix/etc/config.site\r\nfi\r\nfor ac_site_file in \"$ac_site_file1\" \"$ac_site_file2\"\r\ndo\r\n test \"x$ac_site_file\" = xNONE && continue\r\n if test /dev/null != \"$ac_site_file\" && test -r \"$ac_site_file\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: loading site script $ac_site_file\" >&5\r\n$as_echo \"$as_me: loading site script $ac_site_file\" >&6;}\r\n sed 's/^/| /' \"$ac_site_file\" >&5\r\n . \"$ac_site_file\" \\\r\n || { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"failed to load site script $ac_site_file\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\n fi\r\ndone\r\n\r\nif test -r \"$cache_file\"; then\r\n # Some versions of bash will fail to source /dev/null (special files\r\n # actually), so we avoid doing that. 
DJGPP emulates it as a regular file.\r\n if test /dev/null != \"$cache_file\" && test -f \"$cache_file\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: loading cache $cache_file\" >&5\r\n$as_echo \"$as_me: loading cache $cache_file\" >&6;}\r\n case $cache_file in\r\n [\\\\/]* | ?:[\\\\/]* ) . \"$cache_file\";;\r\n *) . \"./$cache_file\";;\r\n esac\r\n fi\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: creating cache $cache_file\" >&5\r\n$as_echo \"$as_me: creating cache $cache_file\" >&6;}\r\n >$cache_file\r\nfi\r\n\r\n# Check that the precious variables saved in the cache have kept the same\r\n# value.\r\nac_cache_corrupted=false\r\nfor ac_var in $ac_precious_vars; do\r\n eval ac_old_set=\\$ac_cv_env_${ac_var}_set\r\n eval ac_new_set=\\$ac_env_${ac_var}_set\r\n eval ac_old_val=\\$ac_cv_env_${ac_var}_value\r\n eval ac_new_val=\\$ac_env_${ac_var}_value\r\n case $ac_old_set,$ac_new_set in\r\n set,)\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: error: \\`$ac_var' was set to \\`$ac_old_val' in the previous run\" >&5\r\n$as_echo \"$as_me: error: \\`$ac_var' was set to \\`$ac_old_val' in the previous run\" >&2;}\r\n ac_cache_corrupted=: ;;\r\n ,set)\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: error: \\`$ac_var' was not set in the previous run\" >&5\r\n$as_echo \"$as_me: error: \\`$ac_var' was not set in the previous run\" >&2;}\r\n ac_cache_corrupted=: ;;\r\n ,);;\r\n *)\r\n if test \"x$ac_old_val\" != \"x$ac_new_val\"; then\r\n\t# differences in whitespace do not lead to failure.\r\n\tac_old_val_w=`echo x $ac_old_val`\r\n\tac_new_val_w=`echo x $ac_new_val`\r\n\tif test \"$ac_old_val_w\" != \"$ac_new_val_w\"; then\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: error: \\`$ac_var' has changed since the previous run:\" >&5\r\n$as_echo \"$as_me: error: \\`$ac_var' has changed since the previous run:\" >&2;}\r\n\t ac_cache_corrupted=:\r\n\telse\r\n\t { $as_echo \"$as_me:${as_lineno-$LINENO}: warning: ignoring whitespace changes in \\`$ac_var' since the previous run:\" >&5\r\n$as_echo \"$as_me: warning: ignoring whitespace changes in \\`$ac_var' since the previous run:\" >&2;}\r\n\t eval $ac_var=\\$ac_old_val\r\n\tfi\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: former value: \\`$ac_old_val'\" >&5\r\n$as_echo \"$as_me: former value: \\`$ac_old_val'\" >&2;}\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: current value: \\`$ac_new_val'\" >&5\r\n$as_echo \"$as_me: current value: \\`$ac_new_val'\" >&2;}\r\n fi;;\r\n esac\r\n # Pass precious variables to config.status.\r\n if test \"$ac_new_set\" = set; then\r\n case $ac_new_val in\r\n *\\'*) ac_arg=$ac_var=`$as_echo \"$ac_new_val\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"` ;;\r\n *) ac_arg=$ac_var=$ac_new_val ;;\r\n esac\r\n case \" $ac_configure_args \" in\r\n *\" '$ac_arg' \"*) ;; # Avoid dups. Use of quotes ensures accuracy.\r\n *) as_fn_append ac_configure_args \" '$ac_arg'\" ;;\r\n esac\r\n fi\r\ndone\r\nif $ac_cache_corrupted; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: error: changes in the environment can compromise the build\" >&5\r\n$as_echo \"$as_me: error: changes in the environment can compromise the build\" >&2;}\r\n as_fn_error $? \"run \\`make distclean' and/or \\`rm $cache_file' and start over\" \"$LINENO\" 5\r\nfi\r\n## -------------------- ##\r\n## Main body of script. 
##\r\n## -------------------- ##\r\n\r\nac_ext=c\r\nac_cpp='$CPP $CPPFLAGS'\r\nac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_c_compiler_gnu\r\n\r\n\r\n\r\n\r\n\r\n\r\nac_aux_dir=\r\nfor ac_dir in \"$srcdir\" \"$srcdir/..\" \"$srcdir/../..\"; do\r\n if test -f \"$ac_dir/install-sh\"; then\r\n ac_aux_dir=$ac_dir\r\n ac_install_sh=\"$ac_aux_dir/install-sh -c\"\r\n break\r\n elif test -f \"$ac_dir/install.sh\"; then\r\n ac_aux_dir=$ac_dir\r\n ac_install_sh=\"$ac_aux_dir/install.sh -c\"\r\n break\r\n elif test -f \"$ac_dir/shtool\"; then\r\n ac_aux_dir=$ac_dir\r\n ac_install_sh=\"$ac_aux_dir/shtool install -c\"\r\n break\r\n fi\r\ndone\r\nif test -z \"$ac_aux_dir\"; then\r\n as_fn_error $? \"cannot find install-sh, install.sh, or shtool in \\\"$srcdir\\\" \\\"$srcdir/..\\\" \\\"$srcdir/../..\\\"\" \"$LINENO\" 5\r\nfi\r\n\r\n# These three variables are undocumented and unsupported,\r\n# and are intended to be withdrawn in a future Autoconf release.\r\n# They can cause serious problems if a builder's source tree is in a directory\r\n# whose full name contains unusual characters.\r\nac_config_guess=\"$SHELL $ac_aux_dir/config.guess\" # Please don't use this var.\r\nac_config_sub=\"$SHELL $ac_aux_dir/config.sub\" # Please don't use this var.\r\nac_configure=\"$SHELL $ac_aux_dir/configure\" # Please don't use this var.\r\n\r\n\r\n# Make sure we can run config.sub.\r\n$SHELL \"$ac_aux_dir/config.sub\" sun4 >/dev/null 2>&1 ||\r\n as_fn_error $? \"cannot run $SHELL $ac_aux_dir/config.sub\" \"$LINENO\" 5\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking build system type\" >&5\r\n$as_echo_n \"checking build system type... \" >&6; }\r\nif ${ac_cv_build+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_build_alias=$build_alias\r\ntest \"x$ac_build_alias\" = x &&\r\n ac_build_alias=`$SHELL \"$ac_aux_dir/config.guess\"`\r\ntest \"x$ac_build_alias\" = x &&\r\n as_fn_error $? \"cannot guess build type; you must specify one\" \"$LINENO\" 5\r\nac_cv_build=`$SHELL \"$ac_aux_dir/config.sub\" $ac_build_alias` ||\r\n as_fn_error $? \"$SHELL $ac_aux_dir/config.sub $ac_build_alias failed\" \"$LINENO\" 5\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_build\" >&5\r\n$as_echo \"$ac_cv_build\" >&6; }\r\ncase $ac_cv_build in\r\n*-*-*) ;;\r\n*) as_fn_error $? \"invalid value of canonical build\" \"$LINENO\" 5;;\r\nesac\r\nbuild=$ac_cv_build\r\nac_save_IFS=$IFS; IFS='-'\r\nset x $ac_cv_build\r\nshift\r\nbuild_cpu=$1\r\nbuild_vendor=$2\r\nshift; shift\r\n# Remember, the first character of IFS is used to create $*,\r\n# except with old shells:\r\nbuild_os=$*\r\nIFS=$ac_save_IFS\r\ncase $build_os in *\\ *) build_os=`echo \"$build_os\" | sed 's/ /-/g'`;; esac\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking host system type\" >&5\r\n$as_echo_n \"checking host system type... \" >&6; }\r\nif ${ac_cv_host+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test \"x$host_alias\" = x; then\r\n ac_cv_host=$ac_cv_build\r\nelse\r\n ac_cv_host=`$SHELL \"$ac_aux_dir/config.sub\" $host_alias` ||\r\n as_fn_error $? \"$SHELL $ac_aux_dir/config.sub $host_alias failed\" \"$LINENO\" 5\r\nfi\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_host\" >&5\r\n$as_echo \"$ac_cv_host\" >&6; }\r\ncase $ac_cv_host in\r\n*-*-*) ;;\r\n*) as_fn_error $? 
\"invalid value of canonical host\" \"$LINENO\" 5;;\r\nesac\r\nhost=$ac_cv_host\r\nac_save_IFS=$IFS; IFS='-'\r\nset x $ac_cv_host\r\nshift\r\nhost_cpu=$1\r\nhost_vendor=$2\r\nshift; shift\r\n# Remember, the first character of IFS is used to create $*,\r\n# except with old shells:\r\nhost_os=$*\r\nIFS=$ac_save_IFS\r\ncase $host_os in *\\ *) host_os=`echo \"$host_os\" | sed 's/ /-/g'`;; esac\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking target system type\" >&5\r\n$as_echo_n \"checking target system type... \" >&6; }\r\nif ${ac_cv_target+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test \"x$target_alias\" = x; then\r\n ac_cv_target=$ac_cv_host\r\nelse\r\n ac_cv_target=`$SHELL \"$ac_aux_dir/config.sub\" $target_alias` ||\r\n as_fn_error $? \"$SHELL $ac_aux_dir/config.sub $target_alias failed\" \"$LINENO\" 5\r\nfi\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_target\" >&5\r\n$as_echo \"$ac_cv_target\" >&6; }\r\ncase $ac_cv_target in\r\n*-*-*) ;;\r\n*) as_fn_error $? \"invalid value of canonical target\" \"$LINENO\" 5;;\r\nesac\r\ntarget=$ac_cv_target\r\nac_save_IFS=$IFS; IFS='-'\r\nset x $ac_cv_target\r\nshift\r\ntarget_cpu=$1\r\ntarget_vendor=$2\r\nshift; shift\r\n# Remember, the first character of IFS is used to create $*,\r\n# except with old shells:\r\ntarget_os=$*\r\nIFS=$ac_save_IFS\r\ncase $target_os in *\\ *) target_os=`echo \"$target_os\" | sed 's/ /-/g'`;; esac\r\n\r\n\r\n# The aliases save the names the user supplied, while $host etc.\r\n# will get canonicalized.\r\ntest -n \"$target_alias\" &&\r\n test \"$program_prefix$program_suffix$program_transform_name\" = \\\r\n NONENONEs,x,x, &&\r\n program_prefix=${target_alias}-\r\n\r\nac_ext=c\r\nac_cpp='$CPP $CPPFLAGS'\r\nac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_c_compiler_gnu\r\nif test -n \"$ac_tool_prefix\"; then\r\n # Extract the first word of \"${ac_tool_prefix}gcc\", so it can be a program name with args.\r\nset dummy ${ac_tool_prefix}gcc; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... \" >&6; }\r\nif ${ac_cv_prog_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$CC\"; then\r\n ac_cv_prog_CC=\"$CC\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_CC=\"${ac_tool_prefix}gcc\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nCC=$ac_cv_prog_CC\r\nif test -n \"$CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CC\" >&5\r\n$as_echo \"$CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\nfi\r\nif test -z \"$ac_cv_prog_CC\"; then\r\n ac_ct_CC=$CC\r\n # Extract the first word of \"gcc\", so it can be a program name with args.\r\nset dummy gcc; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... 
\" >&6; }\r\nif ${ac_cv_prog_ac_ct_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$ac_ct_CC\"; then\r\n ac_cv_prog_ac_ct_CC=\"$ac_ct_CC\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_ac_ct_CC=\"gcc\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nac_ct_CC=$ac_cv_prog_ac_ct_CC\r\nif test -n \"$ac_ct_CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC\" >&5\r\n$as_echo \"$ac_ct_CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n if test \"x$ac_ct_CC\" = x; then\r\n CC=\"\"\r\n else\r\n case $cross_compiling:$ac_tool_warned in\r\nyes:)\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet\" >&5\r\n$as_echo \"$as_me: WARNING: using cross tools not prefixed with host triplet\" >&2;}\r\nac_tool_warned=yes ;;\r\nesac\r\n CC=$ac_ct_CC\r\n fi\r\nelse\r\n CC=\"$ac_cv_prog_CC\"\r\nfi\r\n\r\nif test -z \"$CC\"; then\r\n if test -n \"$ac_tool_prefix\"; then\r\n # Extract the first word of \"${ac_tool_prefix}cc\", so it can be a program name with args.\r\nset dummy ${ac_tool_prefix}cc; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... \" >&6; }\r\nif ${ac_cv_prog_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$CC\"; then\r\n ac_cv_prog_CC=\"$CC\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_CC=\"${ac_tool_prefix}cc\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nCC=$ac_cv_prog_CC\r\nif test -n \"$CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CC\" >&5\r\n$as_echo \"$CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\n fi\r\nfi\r\nif test -z \"$CC\"; then\r\n # Extract the first word of \"cc\", so it can be a program name with args.\r\nset dummy cc; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... 
\" >&6; }\r\nif ${ac_cv_prog_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$CC\"; then\r\n ac_cv_prog_CC=\"$CC\" # Let the user override the test.\r\nelse\r\n ac_prog_rejected=no\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n if test \"$as_dir/$ac_word$ac_exec_ext\" = \"/usr/ucb/cc\"; then\r\n ac_prog_rejected=yes\r\n continue\r\n fi\r\n ac_cv_prog_CC=\"cc\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nif test $ac_prog_rejected = yes; then\r\n # We found a bogon in the path, so make sure we never use it.\r\n set dummy $ac_cv_prog_CC\r\n shift\r\n if test $# != 0; then\r\n # We chose a different compiler from the bogus one.\r\n # However, it has the same basename, so the bogon will be chosen\r\n # first if we set CC to just the basename; use the full file name.\r\n shift\r\n ac_cv_prog_CC=\"$as_dir/$ac_word${1+' '}$@\"\r\n fi\r\nfi\r\nfi\r\nfi\r\nCC=$ac_cv_prog_CC\r\nif test -n \"$CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CC\" >&5\r\n$as_echo \"$CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\nfi\r\nif test -z \"$CC\"; then\r\n if test -n \"$ac_tool_prefix\"; then\r\n for ac_prog in cl.exe\r\n do\r\n # Extract the first word of \"$ac_tool_prefix$ac_prog\", so it can be a program name with args.\r\nset dummy $ac_tool_prefix$ac_prog; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... \" >&6; }\r\nif ${ac_cv_prog_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$CC\"; then\r\n ac_cv_prog_CC=\"$CC\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_CC=\"$ac_tool_prefix$ac_prog\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nCC=$ac_cv_prog_CC\r\nif test -n \"$CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CC\" >&5\r\n$as_echo \"$CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\n test -n \"$CC\" && break\r\n done\r\nfi\r\nif test -z \"$CC\"; then\r\n ac_ct_CC=$CC\r\n for ac_prog in cl.exe\r\ndo\r\n # Extract the first word of \"$ac_prog\", so it can be a program name with args.\r\nset dummy $ac_prog; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... 
\" >&6; }\r\nif ${ac_cv_prog_ac_ct_CC+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$ac_ct_CC\"; then\r\n ac_cv_prog_ac_ct_CC=\"$ac_ct_CC\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_ac_ct_CC=\"$ac_prog\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nac_ct_CC=$ac_cv_prog_ac_ct_CC\r\nif test -n \"$ac_ct_CC\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_ct_CC\" >&5\r\n$as_echo \"$ac_ct_CC\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\n test -n \"$ac_ct_CC\" && break\r\ndone\r\n\r\n if test \"x$ac_ct_CC\" = x; then\r\n CC=\"\"\r\n else\r\n case $cross_compiling:$ac_tool_warned in\r\nyes:)\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet\" >&5\r\n$as_echo \"$as_me: WARNING: using cross tools not prefixed with host triplet\" >&2;}\r\nac_tool_warned=yes ;;\r\nesac\r\n CC=$ac_ct_CC\r\n fi\r\nfi\r\n\r\nfi\r\n\r\n\r\ntest -z \"$CC\" && { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"no acceptable C compiler found in \\$PATH\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\n\r\n# Provide some information about the compiler.\r\n$as_echo \"$as_me:${as_lineno-$LINENO}: checking for C compiler version\" >&5\r\nset X $ac_compile\r\nac_compiler=$2\r\nfor ac_option in --version -v -V -qversion; do\r\n { { ac_try=\"$ac_compiler $ac_option >&5\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_compiler $ac_option >&5\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n sed '10a\\\r\n... rest of stderr output deleted ...\r\n 10q' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n fi\r\n rm -f conftest.er1 conftest.err\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }\r\ndone\r\n\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nac_clean_files_save=$ac_clean_files\r\nac_clean_files=\"$ac_clean_files a.out a.out.dSYM a.exe b.out\"\r\n# Try to create an executable without -o first, disregard a.out.\r\n# It will help us diagnose broken compilers, and finding out an intuition\r\n# of exeext.\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether the C compiler works\" >&5\r\n$as_echo_n \"checking whether the C compiler works... 
\" >&6; }\r\nac_link_default=`$as_echo \"$ac_link\" | sed 's/ -o *conftest[^ ]*//'`\r\n\r\n# The possible output files:\r\nac_files=\"a.out conftest.exe conftest a.exe a_out.exe b.out conftest.*\"\r\n\r\nac_rmfiles=\r\nfor ac_file in $ac_files\r\ndo\r\n case $ac_file in\r\n *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj ) ;;\r\n * ) ac_rmfiles=\"$ac_rmfiles $ac_file\";;\r\n esac\r\ndone\r\nrm -f $ac_rmfiles\r\n\r\nif { { ac_try=\"$ac_link_default\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_link_default\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; then :\r\n # Autoconf-2.13 could set the ac_cv_exeext variable to `no'.\r\n# So ignore a value of `no', otherwise this would lead to `EXEEXT = no'\r\n# in a Makefile. We should not override ac_cv_exeext if it was cached,\r\n# so that the user can short-circuit this test for compilers unknown to\r\n# Autoconf.\r\nfor ac_file in $ac_files ''\r\ndo\r\n test -f \"$ac_file\" || continue\r\n case $ac_file in\r\n *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj )\r\n\t;;\r\n [ab].out )\r\n\t# We found the default executable, but exeext='' is most\r\n\t# certainly right.\r\n\tbreak;;\r\n *.* )\r\n\tif test \"${ac_cv_exeext+set}\" = set && test \"$ac_cv_exeext\" != no;\r\n\tthen :; else\r\n\t ac_cv_exeext=`expr \"$ac_file\" : '[^.]*\\(\\..*\\)'`\r\n\tfi\r\n\t# We set ac_cv_exeext here because the later test for it is not\r\n\t# safe: cross compilers may not add the suffix if given an `-o'\r\n\t# argument, so we may need to know it at that point already.\r\n\t# Even if this section looks crufty: it has the advantage of\r\n\t# actually working.\r\n\tbreak;;\r\n * )\r\n\tbreak;;\r\n esac\r\ndone\r\ntest \"$ac_cv_exeext\" = no && ac_cv_exeext=\r\n\r\nelse\r\n ac_file=''\r\nfi\r\nif test -z \"$ac_file\"; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n$as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n{ { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error 77 \"C compiler cannot create executables\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for C compiler default output file name\" >&5\r\n$as_echo_n \"checking for C compiler default output file name... \" >&6; }\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_file\" >&5\r\n$as_echo \"$ac_file\" >&6; }\r\nac_exeext=$ac_cv_exeext\r\n\r\nrm -f -r a.out a.out.dSYM a.exe conftest$ac_cv_exeext b.out\r\nac_clean_files=$ac_clean_files_save\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for suffix of executables\" >&5\r\n$as_echo_n \"checking for suffix of executables... 
\" >&6; }\r\nif { { ac_try=\"$ac_link\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_link\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; then :\r\n # If both `conftest.exe' and `conftest' are `present' (well, observable)\r\n# catch `conftest.exe'. For instance with Cygwin, `ls conftest' will\r\n# work properly (i.e., refer to `conftest.exe'), while it won't with\r\n# `rm'.\r\nfor ac_file in conftest.exe conftest conftest.*; do\r\n test -f \"$ac_file\" || continue\r\n case $ac_file in\r\n *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM | *.o | *.obj ) ;;\r\n *.* ) ac_cv_exeext=`expr \"$ac_file\" : '[^.]*\\(\\..*\\)'`\r\n\t break;;\r\n * ) break;;\r\n esac\r\ndone\r\nelse\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot compute suffix of executables: cannot compile and link\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nfi\r\nrm -f conftest conftest$ac_cv_exeext\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_exeext\" >&5\r\n$as_echo \"$ac_cv_exeext\" >&6; }\r\n\r\nrm -f conftest.$ac_ext\r\nEXEEXT=$ac_cv_exeext\r\nac_exeext=$EXEEXT\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <stdio.h>\r\nint\r\nmain ()\r\n{\r\nFILE *f = fopen (\"conftest.out\", \"w\");\r\n return ferror (f) || fclose (f) != 0;\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nac_clean_files=\"$ac_clean_files conftest.out\"\r\n# Check that the compiler produces executables we can run. If not, either\r\n# the compiler is broken, or we cross compile.\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether we are cross compiling\" >&5\r\n$as_echo_n \"checking whether we are cross compiling... \" >&6; }\r\nif test \"$cross_compiling\" != yes; then\r\n { { ac_try=\"$ac_link\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_link\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }\r\n if { ac_try='./conftest$ac_cv_exeext'\r\n { { case \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_try\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; }; then\r\n cross_compiling=no\r\n else\r\n if test \"$cross_compiling\" = maybe; then\r\n\tcross_compiling=yes\r\n else\r\n\t{ { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? 
\"cannot run C compiled programs.\r\nIf you meant to cross compile, use \\`--host'.\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\n fi\r\n fi\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $cross_compiling\" >&5\r\n$as_echo \"$cross_compiling\" >&6; }\r\n\r\nrm -f conftest.$ac_ext conftest$ac_cv_exeext conftest.out\r\nac_clean_files=$ac_clean_files_save\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for suffix of object files\" >&5\r\n$as_echo_n \"checking for suffix of object files... \" >&6; }\r\nif ${ac_cv_objext+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nrm -f conftest.o conftest.obj\r\nif { { ac_try=\"$ac_compile\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_compile\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; then :\r\n for ac_file in conftest.o conftest.obj conftest.*; do\r\n test -f \"$ac_file\" || continue;\r\n case $ac_file in\r\n *.$ac_ext | *.xcoff | *.tds | *.d | *.pdb | *.xSYM | *.bb | *.bbg | *.map | *.inf | *.dSYM ) ;;\r\n *) ac_cv_objext=`expr \"$ac_file\" : '.*\\.\\(.*\\)'`\r\n break;;\r\n esac\r\ndone\r\nelse\r\n $as_echo \"$as_me: failed program was:\" >&5\r\nsed 's/^/| /' conftest.$ac_ext >&5\r\n\r\n{ { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot compute suffix of object files: cannot compile\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nfi\r\nrm -f conftest.$ac_cv_objext conftest.$ac_ext\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_objext\" >&5\r\n$as_echo \"$ac_cv_objext\" >&6; }\r\nOBJEXT=$ac_cv_objext\r\nac_objext=$OBJEXT\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether we are using the GNU C compiler\" >&5\r\n$as_echo_n \"checking whether we are using the GNU C compiler... \" >&6; }\r\nif ${ac_cv_c_compiler_gnu+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n#ifndef __GNUC__\r\n choke me\r\n#endif\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_c_try_compile \"$LINENO\"; then :\r\n ac_compiler_gnu=yes\r\nelse\r\n ac_compiler_gnu=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nac_cv_c_compiler_gnu=$ac_compiler_gnu\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_c_compiler_gnu\" >&5\r\n$as_echo \"$ac_cv_c_compiler_gnu\" >&6; }\r\nif test $ac_compiler_gnu = yes; then\r\n GCC=yes\r\nelse\r\n GCC=\r\nfi\r\nac_test_CFLAGS=${CFLAGS+set}\r\nac_save_CFLAGS=$CFLAGS\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether $CC accepts -g\" >&5\r\n$as_echo_n \"checking whether $CC accepts -g... \" >&6; }\r\nif ${ac_cv_prog_cc_g+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_save_c_werror_flag=$ac_c_werror_flag\r\n ac_c_werror_flag=yes\r\n ac_cv_prog_cc_g=no\r\n CFLAGS=\"-g\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_c_try_compile \"$LINENO\"; then :\r\n ac_cv_prog_cc_g=yes\r\nelse\r\n CFLAGS=\"\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_c_try_compile \"$LINENO\"; then :\r\n\r\nelse\r\n ac_c_werror_flag=$ac_save_c_werror_flag\r\n\t CFLAGS=\"-g\"\r\n\t cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_c_try_compile \"$LINENO\"; then :\r\n ac_cv_prog_cc_g=yes\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\n ac_c_werror_flag=$ac_save_c_werror_flag\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_g\" >&5\r\n$as_echo \"$ac_cv_prog_cc_g\" >&6; }\r\nif test \"$ac_test_CFLAGS\" = set; then\r\n CFLAGS=$ac_save_CFLAGS\r\nelif test $ac_cv_prog_cc_g = yes; then\r\n if test \"$GCC\" = yes; then\r\n CFLAGS=\"-g -O2\"\r\n else\r\n CFLAGS=\"-g\"\r\n fi\r\nelse\r\n if test \"$GCC\" = yes; then\r\n CFLAGS=\"-O2\"\r\n else\r\n CFLAGS=\r\n fi\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $CC option to accept ISO C89\" >&5\r\n$as_echo_n \"checking for $CC option to accept ISO C89... \" >&6; }\r\nif ${ac_cv_prog_cc_c89+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_cv_prog_cc_c89=no\r\nac_save_CC=$CC\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <stdarg.h>\r\n#include <stdio.h>\r\nstruct stat;\r\n/* Most of the following tests are stolen from RCS 5.7's src/conf.sh. */\r\nstruct buf { int x; };\r\nFILE * (*rcsopen) (struct buf *, struct stat *, int);\r\nstatic char *e (p, i)\r\n char **p;\r\n int i;\r\n{\r\n return p[i];\r\n}\r\nstatic char *f (char * (*g) (char **, int), char **p, ...)\r\n{\r\n char *s;\r\n va_list v;\r\n va_start (v,p);\r\n s = g (p, va_arg (v,int));\r\n va_end (v);\r\n return s;\r\n}\r\n\r\n/* OSF 4.0 Compaq cc is some sort of almost-ANSI by default. It has\r\n function prototypes and stuff, but not '\\xHH' hex character constants.\r\n These don't provoke an error unfortunately, instead are silently treated\r\n as 'x'. The following induces an error, until -std is added to get\r\n proper ANSI mode. Curiously '\\x00'!='x' always comes out true, for an\r\n array size at least. It's necessary to write '\\x00'==0 to get something\r\n that's true only with -std. */\r\nint osf4_cc_array ['\\x00' == 0 ? 1 : -1];\r\n\r\n/* IBM C 6 for AIX is almost-ANSI by default, but it replaces macro parameters\r\n inside strings and character constants. */\r\n#define FOO(x) 'x'\r\nint xlc6_cc_array[FOO(a) == 'x' ? 
1 : -1];\r\n\r\nint test (int i, double x);\r\nstruct s1 {int (*f) (int a);};\r\nstruct s2 {int (*f) (double a);};\r\nint pairnames (int, char **, FILE *(*)(struct buf *, struct stat *, int), int, int);\r\nint argc;\r\nchar **argv;\r\nint\r\nmain ()\r\n{\r\nreturn f (e, argv, 0) != argv[0] || f (e, argv, 1) != argv[1];\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nfor ac_arg in '' -qlanglvl=extc89 -qlanglvl=ansi -std \\\r\n\t-Ae \"-Aa -D_HPUX_SOURCE\" \"-Xc -D__EXTENSIONS__\"\r\ndo\r\n CC=\"$ac_save_CC $ac_arg\"\r\n if ac_fn_c_try_compile \"$LINENO\"; then :\r\n ac_cv_prog_cc_c89=$ac_arg\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext\r\n test \"x$ac_cv_prog_cc_c89\" != \"xno\" && break\r\ndone\r\nrm -f conftest.$ac_ext\r\nCC=$ac_save_CC\r\n\r\nfi\r\n# AC_CACHE_VAL\r\ncase \"x$ac_cv_prog_cc_c89\" in\r\n x)\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: none needed\" >&5\r\n$as_echo \"none needed\" >&6; } ;;\r\n xno)\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: unsupported\" >&5\r\n$as_echo \"unsupported\" >&6; } ;;\r\n *)\r\n CC=\"$CC $ac_cv_prog_cc_c89\"\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cc_c89\" >&5\r\n$as_echo \"$ac_cv_prog_cc_c89\" >&6; } ;;\r\nesac\r\nif test \"x$ac_cv_prog_cc_c89\" != xno; then :\r\n\r\nfi\r\n\r\nac_ext=c\r\nac_cpp='$CPP $CPPFLAGS'\r\nac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_c_compiler_gnu\r\n\r\nac_ext=cpp\r\nac_cpp='$CXXCPP $CPPFLAGS'\r\nac_compile='$CXX -c $CXXFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CXX -o conftest$ac_exeext $CXXFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_cxx_compiler_gnu\r\nif test -z \"$CXX\"; then\r\n if test -n \"$CCC\"; then\r\n CXX=$CCC\r\n else\r\n if test -n \"$ac_tool_prefix\"; then\r\n for ac_prog in g++ c++ gpp aCC CC cxx cc++ cl.exe FCC KCC RCC xlC_r xlC\r\n do\r\n # Extract the first word of \"$ac_tool_prefix$ac_prog\", so it can be a program name with args.\r\nset dummy $ac_tool_prefix$ac_prog; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... \" >&6; }\r\nif ${ac_cv_prog_CXX+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$CXX\"; then\r\n ac_cv_prog_CXX=\"$CXX\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_CXX=\"$ac_tool_prefix$ac_prog\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nCXX=$ac_cv_prog_CXX\r\nif test -n \"$CXX\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CXX\" >&5\r\n$as_echo \"$CXX\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\n test -n \"$CXX\" && break\r\n done\r\nfi\r\nif test -z \"$CXX\"; then\r\n ac_ct_CXX=$CXX\r\n for ac_prog in g++ c++ gpp aCC CC cxx cc++ cl.exe FCC KCC RCC xlC_r xlC\r\ndo\r\n # Extract the first word of \"$ac_prog\", so it can be a program name with args.\r\nset dummy $ac_prog; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... 
\" >&6; }\r\nif ${ac_cv_prog_ac_ct_CXX+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -n \"$ac_ct_CXX\"; then\r\n ac_cv_prog_ac_ct_CXX=\"$ac_ct_CXX\" # Let the user override the test.\r\nelse\r\nas_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_prog_ac_ct_CXX=\"$ac_prog\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\nfi\r\nfi\r\nac_ct_CXX=$ac_cv_prog_ac_ct_CXX\r\nif test -n \"$ac_ct_CXX\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_ct_CXX\" >&5\r\n$as_echo \"$ac_ct_CXX\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\n test -n \"$ac_ct_CXX\" && break\r\ndone\r\n\r\n if test \"x$ac_ct_CXX\" = x; then\r\n CXX=\"g++\"\r\n else\r\n case $cross_compiling:$ac_tool_warned in\r\nyes:)\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet\" >&5\r\n$as_echo \"$as_me: WARNING: using cross tools not prefixed with host triplet\" >&2;}\r\nac_tool_warned=yes ;;\r\nesac\r\n CXX=$ac_ct_CXX\r\n fi\r\nfi\r\n\r\n fi\r\nfi\r\n# Provide some information about the compiler.\r\n$as_echo \"$as_me:${as_lineno-$LINENO}: checking for C++ compiler version\" >&5\r\nset X $ac_compile\r\nac_compiler=$2\r\nfor ac_option in --version -v -V -qversion; do\r\n { { ac_try=\"$ac_compiler $ac_option >&5\"\r\ncase \"(($ac_try\" in\r\n *\\\"* | *\\`* | *\\\\*) ac_try_echo=\\$ac_try;;\r\n *) ac_try_echo=$ac_try;;\r\nesac\r\neval ac_try_echo=\"\\\"\\$as_me:${as_lineno-$LINENO}: $ac_try_echo\\\"\"\r\n$as_echo \"$ac_try_echo\"; } >&5\r\n (eval \"$ac_compiler $ac_option >&5\") 2>conftest.err\r\n ac_status=$?\r\n if test -s conftest.err; then\r\n sed '10a\\\r\n... rest of stderr output deleted ...\r\n 10q' conftest.err >conftest.er1\r\n cat conftest.er1 >&5\r\n fi\r\n rm -f conftest.er1 conftest.err\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }\r\ndone\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether we are using the GNU C++ compiler\" >&5\r\n$as_echo_n \"checking whether we are using the GNU C++ compiler... \" >&6; }\r\nif ${ac_cv_cxx_compiler_gnu+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n#ifndef __GNUC__\r\n choke me\r\n#endif\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n ac_compiler_gnu=yes\r\nelse\r\n ac_compiler_gnu=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nac_cv_cxx_compiler_gnu=$ac_compiler_gnu\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_cxx_compiler_gnu\" >&5\r\n$as_echo \"$ac_cv_cxx_compiler_gnu\" >&6; }\r\nif test $ac_compiler_gnu = yes; then\r\n GXX=yes\r\nelse\r\n GXX=\r\nfi\r\nac_test_CXXFLAGS=${CXXFLAGS+set}\r\nac_save_CXXFLAGS=$CXXFLAGS\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether $CXX accepts -g\" >&5\r\n$as_echo_n \"checking whether $CXX accepts -g... 
\" >&6; }\r\nif ${ac_cv_prog_cxx_g+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_save_cxx_werror_flag=$ac_cxx_werror_flag\r\n ac_cxx_werror_flag=yes\r\n ac_cv_prog_cxx_g=no\r\n CXXFLAGS=\"-g\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n ac_cv_prog_cxx_g=yes\r\nelse\r\n CXXFLAGS=\"\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n\r\nelse\r\n ac_cxx_werror_flag=$ac_save_cxx_werror_flag\r\n\t CXXFLAGS=\"-g\"\r\n\t cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n ac_cv_prog_cxx_g=yes\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\n ac_cxx_werror_flag=$ac_save_cxx_werror_flag\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_prog_cxx_g\" >&5\r\n$as_echo \"$ac_cv_prog_cxx_g\" >&6; }\r\nif test \"$ac_test_CXXFLAGS\" = set; then\r\n CXXFLAGS=$ac_save_CXXFLAGS\r\nelif test $ac_cv_prog_cxx_g = yes; then\r\n if test \"$GXX\" = yes; then\r\n CXXFLAGS=\"-g -O2\"\r\n else\r\n CXXFLAGS=\"-g\"\r\n fi\r\nelse\r\n if test \"$GXX\" = yes; then\r\n CXXFLAGS=\"-O2\"\r\n else\r\n CXXFLAGS=\r\n fi\r\nfi\r\nac_ext=c\r\nac_cpp='$CPP $CPPFLAGS'\r\nac_compile='$CC -c $CFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CC -o conftest$ac_exeext $CFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_c_compiler_gnu\r\n\r\nac_ext=cpp\r\nac_cpp='$CXXCPP $CPPFLAGS'\r\nac_compile='$CXX -c $CXXFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CXX -o conftest$ac_exeext $CXXFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_cxx_compiler_gnu\r\n\r\n\r\nS9XFLGS=\"\"\r\nS9XDEFS=\"\"\r\nS9XLIBS=\"\"\r\n\r\n\r\n\r\n# *****************************\r\n# *** Execution begins here ***\r\n# *****************************\r\n\r\n# Test what compiler flags we should use.\r\n\r\n# Check whether --enable-debug was given.\r\nif test \"${enable_debug+set}\" = set; then :\r\n enableval=$enable_debug;\r\nelse\r\n enable_debug=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_debug\" = \"xyes\"; then\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -g\" >&5\r\n$as_echo_n \"checking whether g++ accepts -g... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_g+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -g\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. 
*/\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_g=\"yes\"\r\nelse\r\n snes9x_cv_option_g=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_g\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -g\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -O0\" >&5\r\n$as_echo_n \"checking whether g++ accepts -O0... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_o0+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -O0\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_o0=\"yes\"\r\nelse\r\n snes9x_cv_option_o0=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_o0\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -O0\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\nelse\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -O3\" >&5\r\n$as_echo_n \"checking whether g++ accepts -O3... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_o3+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -O3\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. 
*/\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_o3=\"yes\"\r\nelse\r\n snes9x_cv_option_o3=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_o3\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -O3\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -O2\" >&5\r\n$as_echo_n \"checking whether g++ accepts -O2... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_o2+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -O2\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_o2=\"yes\"\r\nelse\r\n snes9x_cv_option_o2=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_o2\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -O2\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -O1\" >&5\r\n$as_echo_n \"checking whether g++ accepts -O1... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_o1+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -O1\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. 
*/\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_o1=\"yes\"\r\nelse\r\n snes9x_cv_option_o1=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_o1\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -O1\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\tfi\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -fomit-frame-pointer\" >&5\r\n$as_echo_n \"checking whether g++ accepts -fomit-frame-pointer... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_omit_frame_pointer+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -fomit-frame-pointer\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_omit_frame_pointer=\"yes\"\r\nelse\r\n snes9x_cv_option_omit_frame_pointer=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_omit_frame_pointer\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -fomit-frame-pointer\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\nfi\r\n\r\n# Check whether --enable-mtune was given.\r\nif test \"${enable_mtune+set}\" = set; then :\r\n enableval=$enable_mtune;\r\nelse\r\n enable_mtune=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_mtune\" != \"xno\"; then\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -mtune=\\\"$enable_mtune\\\"\" >&5\r\n$as_echo_n \"checking whether g++ accepts -mtune=\\\"$enable_mtune\\\"... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_mtune+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -mtune=\"$enable_mtune\"\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_mtune=\"yes\"\r\nelse\r\n snes9x_cv_option_mtune=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_mtune\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -mtune=\"$enable_mtune\"\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: -mtune failed, trying -mcpu...\" >&5\r\n$as_echo \"$as_me: WARNING: -mtune failed, trying -mcpu...\" >&2;}\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -mcpu=\\\"$enable_mtune\\\"\" >&5\r\n$as_echo_n \"checking whether g++ accepts -mcpu=\\\"$enable_mtune\\\"... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_mcpu+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -mcpu=\"$enable_mtune\"\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_mcpu=\"yes\"\r\nelse\r\n snes9x_cv_option_mcpu=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_mcpu\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -mcpu=\"$enable_mtune\"\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\t\tas_fn_error $? \"Please specify a working value for --enable-mtune.\" \"$LINENO\" 5\r\n\tfi\r\n\r\n\r\n\tfi\r\n\r\nfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -fno-exceptions\" >&5\r\n$as_echo_n \"checking whether g++ accepts -fno-exceptions... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_no_exceptions+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -fno-exceptions\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? 
\"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_no_exceptions=\"yes\"\r\nelse\r\n snes9x_cv_option_no_exceptions=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_no_exceptions\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -fno-exceptions\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -fno-rtti\" >&5\r\n$as_echo_n \"checking whether g++ accepts -fno-rtti... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_no_rtti+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -fno-rtti\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_no_rtti=\"yes\"\r\nelse\r\n snes9x_cv_option_no_rtti=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_no_rtti\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -fno-rtti\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -pedantic\" >&5\r\n$as_echo_n \"checking whether g++ accepts -pedantic... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_pedantic+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -pedantic\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? 
\"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_pedantic=\"yes\"\r\nelse\r\n snes9x_cv_option_pedantic=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_pedantic\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -pedantic\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -Wall\" >&5\r\n$as_echo_n \"checking whether g++ accepts -Wall... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_Wall+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -Wall\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_Wall=\"yes\"\r\nelse\r\n snes9x_cv_option_Wall=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_Wall\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -Wall\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -W\" >&5\r\n$as_echo_n \"checking whether g++ accepts -W... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_W+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -W\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_W=\"yes\"\r\nelse\r\n snes9x_cv_option_W=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_W\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -W\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -Wno-unused-parameter\" >&5\r\n$as_echo_n \"checking whether g++ accepts -Wno-unused-parameter... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_Wno_unused_parameter+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -Wno-unused-parameter\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_Wno_unused_parameter=\"yes\"\r\nelse\r\n snes9x_cv_option_Wno_unused_parameter=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_Wno_unused_parameter\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -Wno-unused-parameter\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\r\n# Enable SSE4.1\r\n# Check whether --enable-sse41 was given.\r\nif test \"${enable_sse41+set}\" = set; then :\r\n enableval=$enable_sse41;\r\nelse\r\n enable_sse41=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_sse41\" = \"xyes\"; then\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -msse4.1\" >&5\r\n$as_echo_n \"checking whether g++ accepts -msse4.1... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_sse41+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -msse4.1\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? 
\"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_sse41=\"yes\"\r\nelse\r\n snes9x_cv_option_sse41=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_sse41\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -msse4.1\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\tif test \"x$snes9x_cv_option_sse41\" != \"xyes\"; then\r\n\t\tenable_sse41=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Enable AVX2\r\n# Check whether --enable-avx2 was given.\r\nif test \"${enable_avx2+set}\" = set; then :\r\n enableval=$enable_avx2;\r\nelse\r\n enable_avx2=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_avx2\" = \"xyes\"; then\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -mavx2\" >&5\r\n$as_echo_n \"checking whether g++ accepts -mavx2... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_avx2+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -mavx2\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. 
*/\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_avx2=\"yes\"\r\nelse\r\n snes9x_cv_option_avx2=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_avx2\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -mavx2\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\tif test \"x$snes9x_cv_option_avx2\" != \"xyes\"; then\r\n\t\tenable_avx2=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Enable ARM NEON\r\n# Check whether --enable-neon was given.\r\nif test \"${enable_neon+set}\" = set; then :\r\n enableval=$enable_neon;\r\nelse\r\n enable_neon=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_neon\" = \"xyes\"; then\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether g++ accepts -mfpu=neon\" >&5\r\n$as_echo_n \"checking whether g++ accepts -mfpu=neon... \" >&6; }\r\n\r\n\tif ${snes9x_cv_option_neon+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\t\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\t\tCXXFLAGS=\"$OLD_CXXFLAGS -mfpu=neon\"\r\n\r\n\t\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t\tint\tfoo;\r\n\r\n\t\t\tint\tmain (int argc, char **argv)\r\n\t\t\t{\r\n\t\t\t\t/* The following code triggs gcc:s generation of aline opcodes,\r\n\t\t\t\t which some versions of as does not support. */\r\n\r\n\t\t\t\tif (argc > 0)\r\n\t\t\t\t\targc = 0;\r\n\r\n\t\t\t\treturn (argc);\r\n\t\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_cv_option_neon=\"yes\"\r\nelse\r\n snes9x_cv_option_neon=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_cv_option_neon\" = \"xyes\"; then\r\n\t\tS9XFLGS=\"$S9XFLGS -mfpu=neon\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\n\tfi\r\n\r\n\tif test \"x$snes9x_cv_option_neon\" != \"xyes\"; then\r\n\t\tenable_neon=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Check if the OS is Linux.\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether the OS is Linux\" >&5\r\n$as_echo_n \"checking whether the OS is Linux... 
\" >&6; }\r\n\r\nif ${snes9x_cv_linux_os+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\tcase \"$target\" in\r\n\t\t*-*-linux*)\r\n\t\t\tsnes9x_cv_linux_os=\"yes\"\r\n\t\t\t;;\r\n\t\t*)\r\n\t\t\tsnes9x_cv_linux_os=\"no\"\r\n\t\t\t;;\r\n\tesac\r\n\r\nfi\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $snes9x_cv_linux_os\" >&5\r\n$as_echo \"$snes9x_cv_linux_os\" >&6; }\r\n\r\n# Enable gamepad support on Linux.\r\n\r\n# Check whether --enable-gamepad was given.\r\nif test \"${enable_gamepad+set}\" = set; then :\r\n enableval=$enable_gamepad;\r\nelse\r\n enable_gamepad=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_gamepad\" = \"xyes\"; then\r\n\tif test \"x$snes9x_cv_linux_os\" = \"xyes\"; then\r\n\t\tS9XDEFS=\"$S9XDEFS -DJOYSTICK_SUPPORT\"\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: Your OS is not Linux. Build without gamepad support.\" >&5\r\n$as_echo \"$as_me: WARNING: Your OS is not Linux. Build without gamepad support.\" >&2;}\r\n\t\tenable_gamepad=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Enable debugger.\r\n\r\nS9XDEBUGGER=\"#S9XDEBUGGER=1\"\r\n\r\n# Check whether --enable-debugger was given.\r\nif test \"${enable_debugger+set}\" = set; then :\r\n enableval=$enable_debugger;\r\nelse\r\n enable_debugger=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_debugger\" = \"xyes\"; then\r\n\tS9XDEBUGGER=\"S9XDEBUGGER=1\"\r\n\tS9XDEFS=\"$S9XDEFS -DDEBUGGER\"\r\nfi\r\n\r\n# Enable netplay support if requested.\r\n\r\nS9XNETPLAY=\"#S9XNETPLAY=1\"\r\n\r\n# Check whether --enable-netplay was given.\r\nif test \"${enable_netplay+set}\" = set; then :\r\n enableval=$enable_netplay;\r\nelse\r\n enable_netplay=\"no\"\r\nfi\r\n\r\n\r\nif test \"x$enable_netplay\" = \"xyes\"; then\r\n\tS9XNETPLAY=\"S9XNETPLAY=1\"\r\n\tS9XDEFS=\"$S9XDEFS -DNETPLAY_SUPPORT\"\r\nfi\r\n\r\n# Enable GZIP support through zlib.\r\n\r\nac_ext=cpp\r\nac_cpp='$CXXCPP $CPPFLAGS'\r\nac_compile='$CXX -c $CXXFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CXX -o conftest$ac_exeext $CXXFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_cxx_compiler_gnu\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking how to run the C++ preprocessor\" >&5\r\n$as_echo_n \"checking how to run the C++ preprocessor... \" >&6; }\r\nif test -z \"$CXXCPP\"; then\r\n if ${ac_cv_prog_CXXCPP+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n # Double quotes because CXXCPP needs to be expanded\r\n for CXXCPP in \"$CXX -E\" \"/lib/cpp\"\r\n do\r\n ac_preproc_ok=false\r\nfor ac_cxx_preproc_warn_flag in '' yes\r\ndo\r\n # Use a header file that comes with gcc, so configuring glibc\r\n # with a fresh cross-compiler works.\r\n # Prefer <limits.h> to <assert.h> if __STDC__ is defined, since\r\n # <limits.h> exists even on freestanding compilers.\r\n # On the NeXT, cc -E runs the code through the compiler's parser,\r\n # not just through cpp. \"Syntax error\" is here to catch this case.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#ifdef __STDC__\r\n# include <limits.h>\r\n#else\r\n# include <assert.h>\r\n#endif\r\n\t\t Syntax error\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n\r\nelse\r\n # Broken: fails on valid input.\r\ncontinue\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\n\r\n # OK, works on sane cases. Now check whether nonexistent headers\r\n # can be detected and how.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n#include <ac_nonexistent.h>\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n # Broken: success on invalid input.\r\ncontinue\r\nelse\r\n # Passes both tests.\r\nac_preproc_ok=:\r\nbreak\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\n\r\ndone\r\n# Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped.\r\nrm -f conftest.i conftest.err conftest.$ac_ext\r\nif $ac_preproc_ok; then :\r\n break\r\nfi\r\n\r\n done\r\n ac_cv_prog_CXXCPP=$CXXCPP\r\n\r\nfi\r\n CXXCPP=$ac_cv_prog_CXXCPP\r\nelse\r\n ac_cv_prog_CXXCPP=$CXXCPP\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $CXXCPP\" >&5\r\n$as_echo \"$CXXCPP\" >&6; }\r\nac_preproc_ok=false\r\nfor ac_cxx_preproc_warn_flag in '' yes\r\ndo\r\n # Use a header file that comes with gcc, so configuring glibc\r\n # with a fresh cross-compiler works.\r\n # Prefer <limits.h> to <assert.h> if __STDC__ is defined, since\r\n # <limits.h> exists even on freestanding compilers.\r\n # On the NeXT, cc -E runs the code through the compiler's parser,\r\n # not just through cpp. \"Syntax error\" is here to catch this case.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#ifdef __STDC__\r\n# include <limits.h>\r\n#else\r\n# include <assert.h>\r\n#endif\r\n\t\t Syntax error\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n\r\nelse\r\n # Broken: fails on valid input.\r\ncontinue\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\n\r\n # OK, works on sane cases. Now check whether nonexistent headers\r\n # can be detected and how.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <ac_nonexistent.h>\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n # Broken: success on invalid input.\r\ncontinue\r\nelse\r\n # Passes both tests.\r\nac_preproc_ok=:\r\nbreak\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\n\r\ndone\r\n# Because of `break', _AC_PREPROC_IFELSE's cleaning code was skipped.\r\nrm -f conftest.i conftest.err conftest.$ac_ext\r\nif $ac_preproc_ok; then :\r\n\r\nelse\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"C++ preprocessor \\\"$CXXCPP\\\" fails sanity check\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nfi\r\n\r\nac_ext=cpp\r\nac_cpp='$CXXCPP $CPPFLAGS'\r\nac_compile='$CXX -c $CXXFLAGS $CPPFLAGS conftest.$ac_ext >&5'\r\nac_link='$CXX -o conftest$ac_exeext $CXXFLAGS $CPPFLAGS $LDFLAGS conftest.$ac_ext $LIBS >&5'\r\nac_compiler_gnu=$ac_cv_cxx_compiler_gnu\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for grep that handles long lines and -e\" >&5\r\n$as_echo_n \"checking for grep that handles long lines and -e... 
\" >&6; }\r\nif ${ac_cv_path_GREP+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if test -z \"$GREP\"; then\r\n ac_path_GREP_found=false\r\n # Loop through the user's path and test for each of PROGNAME-LIST\r\n as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH$PATH_SEPARATOR/usr/xpg4/bin\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_prog in grep ggrep; do\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n ac_path_GREP=\"$as_dir/$ac_prog$ac_exec_ext\"\r\n as_fn_executable_p \"$ac_path_GREP\" || continue\r\n# Check for GNU ac_path_GREP and select it if it is found.\r\n # Check for GNU $ac_path_GREP\r\ncase `\"$ac_path_GREP\" --version 2>&1` in\r\n*GNU*)\r\n ac_cv_path_GREP=\"$ac_path_GREP\" ac_path_GREP_found=:;;\r\n*)\r\n ac_count=0\r\n $as_echo_n 0123456789 >\"conftest.in\"\r\n while :\r\n do\r\n cat \"conftest.in\" \"conftest.in\" >\"conftest.tmp\"\r\n mv \"conftest.tmp\" \"conftest.in\"\r\n cp \"conftest.in\" \"conftest.nl\"\r\n $as_echo 'GREP' >> \"conftest.nl\"\r\n \"$ac_path_GREP\" -e 'GREP$' -e '-(cannot match)-' < \"conftest.nl\" >\"conftest.out\" 2>/dev/null || break\r\n diff \"conftest.out\" \"conftest.nl\" >/dev/null 2>&1 || break\r\n as_fn_arith $ac_count + 1 && ac_count=$as_val\r\n if test $ac_count -gt ${ac_path_GREP_max-0}; then\r\n # Best one so far, save it but keep looking for a better one\r\n ac_cv_path_GREP=\"$ac_path_GREP\"\r\n ac_path_GREP_max=$ac_count\r\n fi\r\n # 10*(2^10) chars as input seems more than enough\r\n test $ac_count -gt 10 && break\r\n done\r\n rm -f conftest.in conftest.tmp conftest.nl conftest.out;;\r\nesac\r\n\r\n $ac_path_GREP_found && break 3\r\n done\r\n done\r\n done\r\nIFS=$as_save_IFS\r\n if test -z \"$ac_cv_path_GREP\"; then\r\n as_fn_error $? \"no acceptable grep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin\" \"$LINENO\" 5\r\n fi\r\nelse\r\n ac_cv_path_GREP=$GREP\r\nfi\r\n\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_GREP\" >&5\r\n$as_echo \"$ac_cv_path_GREP\" >&6; }\r\n GREP=\"$ac_cv_path_GREP\"\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for egrep\" >&5\r\n$as_echo_n \"checking for egrep... 
\" >&6; }\r\nif ${ac_cv_path_EGREP+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n if echo a | $GREP -E '(a|b)' >/dev/null 2>&1\r\n then ac_cv_path_EGREP=\"$GREP -E\"\r\n else\r\n if test -z \"$EGREP\"; then\r\n ac_path_EGREP_found=false\r\n # Loop through the user's path and test for each of PROGNAME-LIST\r\n as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH$PATH_SEPARATOR/usr/xpg4/bin\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_prog in egrep; do\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n ac_path_EGREP=\"$as_dir/$ac_prog$ac_exec_ext\"\r\n as_fn_executable_p \"$ac_path_EGREP\" || continue\r\n# Check for GNU ac_path_EGREP and select it if it is found.\r\n # Check for GNU $ac_path_EGREP\r\ncase `\"$ac_path_EGREP\" --version 2>&1` in\r\n*GNU*)\r\n ac_cv_path_EGREP=\"$ac_path_EGREP\" ac_path_EGREP_found=:;;\r\n*)\r\n ac_count=0\r\n $as_echo_n 0123456789 >\"conftest.in\"\r\n while :\r\n do\r\n cat \"conftest.in\" \"conftest.in\" >\"conftest.tmp\"\r\n mv \"conftest.tmp\" \"conftest.in\"\r\n cp \"conftest.in\" \"conftest.nl\"\r\n $as_echo 'EGREP' >> \"conftest.nl\"\r\n \"$ac_path_EGREP\" 'EGREP$' < \"conftest.nl\" >\"conftest.out\" 2>/dev/null || break\r\n diff \"conftest.out\" \"conftest.nl\" >/dev/null 2>&1 || break\r\n as_fn_arith $ac_count + 1 && ac_count=$as_val\r\n if test $ac_count -gt ${ac_path_EGREP_max-0}; then\r\n # Best one so far, save it but keep looking for a better one\r\n ac_cv_path_EGREP=\"$ac_path_EGREP\"\r\n ac_path_EGREP_max=$ac_count\r\n fi\r\n # 10*(2^10) chars as input seems more than enough\r\n test $ac_count -gt 10 && break\r\n done\r\n rm -f conftest.in conftest.tmp conftest.nl conftest.out;;\r\nesac\r\n\r\n $ac_path_EGREP_found && break 3\r\n done\r\n done\r\n done\r\nIFS=$as_save_IFS\r\n if test -z \"$ac_cv_path_EGREP\"; then\r\n as_fn_error $? \"no acceptable egrep could be found in $PATH$PATH_SEPARATOR/usr/xpg4/bin\" \"$LINENO\" 5\r\n fi\r\nelse\r\n ac_cv_path_EGREP=$EGREP\r\nfi\r\n\r\n fi\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_path_EGREP\" >&5\r\n$as_echo \"$ac_cv_path_EGREP\" >&6; }\r\n EGREP=\"$ac_cv_path_EGREP\"\r\n\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for ANSI C header files\" >&5\r\n$as_echo_n \"checking for ANSI C header files... \" >&6; }\r\nif ${ac_cv_header_stdc+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <stdlib.h>\r\n#include <stdarg.h>\r\n#include <string.h>\r\n#include <float.h>\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_compile \"$LINENO\"; then :\r\n ac_cv_header_stdc=yes\r\nelse\r\n ac_cv_header_stdc=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext conftest.$ac_ext\r\n\r\nif test $ac_cv_header_stdc = yes; then\r\n # SunOS 4.x string.h does not declare mem*, contrary to ANSI.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <string.h>\r\n\r\n_ACEOF\r\nif (eval \"$ac_cpp conftest.$ac_ext\") 2>&5 |\r\n $EGREP \"memchr\" >/dev/null 2>&1; then :\r\n\r\nelse\r\n ac_cv_header_stdc=no\r\nfi\r\nrm -f conftest*\r\n\r\nfi\r\n\r\nif test $ac_cv_header_stdc = yes; then\r\n # ISC 2.0.2 stdlib.h does not declare free, contrary to ANSI.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n#include <stdlib.h>\r\n\r\n_ACEOF\r\nif (eval \"$ac_cpp conftest.$ac_ext\") 2>&5 |\r\n $EGREP \"free\" >/dev/null 2>&1; then :\r\n\r\nelse\r\n ac_cv_header_stdc=no\r\nfi\r\nrm -f conftest*\r\n\r\nfi\r\n\r\nif test $ac_cv_header_stdc = yes; then\r\n # /bin/cc in Irix-4.0.5 gets non-ANSI ctype macros unless using -ansi.\r\n if test \"$cross_compiling\" = yes; then :\r\n :\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <ctype.h>\r\n#include <stdlib.h>\r\n#if ((' ' & 0x0FF) == 0x020)\r\n# define ISLOWER(c) ('a' <= (c) && (c) <= 'z')\r\n# define TOUPPER(c) (ISLOWER(c) ? 'A' + ((c) - 'a') : (c))\r\n#else\r\n# define ISLOWER(c) \\\r\n\t\t (('a' <= (c) && (c) <= 'i') \\\r\n\t\t || ('j' <= (c) && (c) <= 'r') \\\r\n\t\t || ('s' <= (c) && (c) <= 'z'))\r\n# define TOUPPER(c) (ISLOWER(c) ? ((c) | 0x40) : (c))\r\n#endif\r\n\r\n#define XOR(e, f) (((e) && !(f)) || (!(e) && (f)))\r\nint\r\nmain ()\r\n{\r\n int i;\r\n for (i = 0; i < 256; i++)\r\n if (XOR (islower (i), ISLOWER (i))\r\n\t|| toupper (i) != TOUPPER (i))\r\n return 2;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n\r\nelse\r\n ac_cv_header_stdc=no\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\nfi\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_header_stdc\" >&5\r\n$as_echo \"$ac_cv_header_stdc\" >&6; }\r\nif test $ac_cv_header_stdc = yes; then\r\n\r\n$as_echo \"#define STDC_HEADERS 1\" >>confdefs.h\r\n\r\nfi\r\n\r\n# On IRIX 5.3, sys/types and inttypes.h are conflicting.\r\nfor ac_header in sys/types.h sys/stat.h stdlib.h string.h memory.h strings.h \\\r\n\t\t inttypes.h stdint.h unistd.h\r\ndo :\r\n as_ac_Header=`$as_echo \"ac_cv_header_$ac_header\" | $as_tr_sh`\r\nac_fn_cxx_check_header_compile \"$LINENO\" \"$ac_header\" \"$as_ac_Header\" \"$ac_includes_default\r\n\"\r\nif eval test \\\"x\\$\"$as_ac_Header\"\\\" = x\"yes\"; then :\r\n cat >>confdefs.h <<_ACEOF\r\n#define `$as_echo \"HAVE_$ac_header\" | $as_tr_cpp` 1\r\n_ACEOF\r\n\r\nfi\r\n\r\ndone\r\n\r\n\r\nif ${snes9x_cv_zlib+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\tac_fn_cxx_check_header_mongrel \"$LINENO\" \"zlib.h\" \"ac_cv_header_zlib_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_zlib_h\" = xyes; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for gzread in -lz\" >&5\r\n$as_echo_n \"checking for gzread in -lz... \" >&6; }\r\nif ${ac_cv_lib_z_gzread+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lz $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. 
*/\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar gzread ();\r\nint\r\nmain ()\r\n{\r\nreturn gzread ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_z_gzread=yes\r\nelse\r\n ac_cv_lib_z_gzread=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_z_gzread\" >&5\r\n$as_echo \"$ac_cv_lib_z_gzread\" >&6; }\r\nif test \"x$ac_cv_lib_z_gzread\" = xyes; then :\r\n snes9x_cv_zlib=\"yes\"\r\nelse\r\n snes9x_cv_zlib=\"no\"\r\nfi\r\n\r\nelse\r\n snes9x_cv_zlib=\"no\"\r\nfi\r\n\r\n\r\n\r\nfi\r\n\r\n\r\n# Check whether --enable-gzip was given.\r\nif test \"${enable_gzip+set}\" = set; then :\r\n enableval=$enable_gzip;\r\nelse\r\n enable_gzip=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_gzip\" = \"xyes\"; then\r\n\tif test \"x$snes9x_cv_zlib\" = \"xyes\"; then\r\n\t\tS9XDEFS=\"$S9XDEFS -DZLIB\"\r\n\t\tS9XLIBS=\"$S9XLIBS -lz\"\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: zlib not found. Build without GZIP support.\" >&5\r\n$as_echo \"$as_me: WARNING: zlib not found. Build without GZIP support.\" >&2;}\r\n\t\tenable_gzip=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Enable ZIP support through zlib.\r\n\r\nS9XZIP=\"#S9XZIP=1\"\r\n\r\n# Check whether --enable-zip was given.\r\nif test \"${enable_zip+set}\" = set; then :\r\n enableval=$enable_zip;\r\nelse\r\n enable_zip=\"yes\"\r\nfi\r\n\r\n\r\nS9X_SYSTEM_ZIP=\"#SYSTEM_ZIP=1\"\r\n\r\n\r\n# Check whether --with-system-zip was given.\r\nif test \"${with_system_zip+set}\" = set; then :\r\n withval=$with_system_zip;\r\nelse\r\n with_system_zip=\"check\"\r\nfi\r\n\r\n\r\nif test \"x$enable_zip\" = \"xyes\"; then\r\n\tif test \"x$with_system_zip\" != \"xno\"; then\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\nif test \"x$ac_cv_env_PKG_CONFIG_set\" != \"xset\"; then\r\n\tif test -n \"$ac_tool_prefix\"; then\r\n # Extract the first word of \"${ac_tool_prefix}pkg-config\", so it can be a program name with args.\r\nset dummy ${ac_tool_prefix}pkg-config; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... 
\" >&6; }\r\nif ${ac_cv_path_PKG_CONFIG+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n case $PKG_CONFIG in\r\n [\\\\/]* | ?:[\\\\/]*)\r\n ac_cv_path_PKG_CONFIG=\"$PKG_CONFIG\" # Let the user override the test with a path.\r\n ;;\r\n *)\r\n as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_path_PKG_CONFIG=\"$as_dir/$ac_word$ac_exec_ext\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\n ;;\r\nesac\r\nfi\r\nPKG_CONFIG=$ac_cv_path_PKG_CONFIG\r\nif test -n \"$PKG_CONFIG\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $PKG_CONFIG\" >&5\r\n$as_echo \"$PKG_CONFIG\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n\r\nfi\r\nif test -z \"$ac_cv_path_PKG_CONFIG\"; then\r\n ac_pt_PKG_CONFIG=$PKG_CONFIG\r\n # Extract the first word of \"pkg-config\", so it can be a program name with args.\r\nset dummy pkg-config; ac_word=$2\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for $ac_word\" >&5\r\n$as_echo_n \"checking for $ac_word... \" >&6; }\r\nif ${ac_cv_path_ac_pt_PKG_CONFIG+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n case $ac_pt_PKG_CONFIG in\r\n [\\\\/]* | ?:[\\\\/]*)\r\n ac_cv_path_ac_pt_PKG_CONFIG=\"$ac_pt_PKG_CONFIG\" # Let the user override the test with a path.\r\n ;;\r\n *)\r\n as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n for ac_exec_ext in '' $ac_executable_extensions; do\r\n if as_fn_executable_p \"$as_dir/$ac_word$ac_exec_ext\"; then\r\n ac_cv_path_ac_pt_PKG_CONFIG=\"$as_dir/$ac_word$ac_exec_ext\"\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: found $as_dir/$ac_word$ac_exec_ext\" >&5\r\n break 2\r\n fi\r\ndone\r\n done\r\nIFS=$as_save_IFS\r\n\r\n ;;\r\nesac\r\nfi\r\nac_pt_PKG_CONFIG=$ac_cv_path_ac_pt_PKG_CONFIG\r\nif test -n \"$ac_pt_PKG_CONFIG\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_pt_PKG_CONFIG\" >&5\r\n$as_echo \"$ac_pt_PKG_CONFIG\" >&6; }\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\nfi\r\n\r\n if test \"x$ac_pt_PKG_CONFIG\" = x; then\r\n PKG_CONFIG=\"\"\r\n else\r\n case $cross_compiling:$ac_tool_warned in\r\nyes:)\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: using cross tools not prefixed with host triplet\" >&5\r\n$as_echo \"$as_me: WARNING: using cross tools not prefixed with host triplet\" >&2;}\r\nac_tool_warned=yes ;;\r\nesac\r\n PKG_CONFIG=$ac_pt_PKG_CONFIG\r\n fi\r\nelse\r\n PKG_CONFIG=\"$ac_cv_path_PKG_CONFIG\"\r\nfi\r\n\r\nfi\r\nif test -n \"$PKG_CONFIG\"; then\r\n\t_pkg_min_version=0.9.0\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking pkg-config is at least version $_pkg_min_version\" >&5\r\n$as_echo_n \"checking pkg-config is at least version $_pkg_min_version... 
\" >&6; }\r\n\tif $PKG_CONFIG --atleast-pkgconfig-version $_pkg_min_version; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\t\tPKG_CONFIG=\"\"\r\n\tfi\r\nfi\r\n\r\npkg_failed=no\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for SYSTEM_ZIP\" >&5\r\n$as_echo_n \"checking for SYSTEM_ZIP... \" >&6; }\r\n\r\nif test -n \"$SYSTEM_ZIP_CFLAGS\"; then\r\n pkg_cv_SYSTEM_ZIP_CFLAGS=\"$SYSTEM_ZIP_CFLAGS\"\r\n elif test -n \"$PKG_CONFIG\"; then\r\n if test -n \"$PKG_CONFIG\" && \\\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: \\$PKG_CONFIG --exists --print-errors \\\"minizip\\\"\"; } >&5\r\n ($PKG_CONFIG --exists --print-errors \"minizip\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; then\r\n pkg_cv_SYSTEM_ZIP_CFLAGS=`$PKG_CONFIG --cflags \"minizip\" 2>/dev/null`\r\n\t\t test \"x$?\" != \"x0\" && pkg_failed=yes\r\nelse\r\n pkg_failed=yes\r\nfi\r\n else\r\n pkg_failed=untried\r\nfi\r\nif test -n \"$SYSTEM_ZIP_LIBS\"; then\r\n pkg_cv_SYSTEM_ZIP_LIBS=\"$SYSTEM_ZIP_LIBS\"\r\n elif test -n \"$PKG_CONFIG\"; then\r\n if test -n \"$PKG_CONFIG\" && \\\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: \\$PKG_CONFIG --exists --print-errors \\\"minizip\\\"\"; } >&5\r\n ($PKG_CONFIG --exists --print-errors \"minizip\") 2>&5\r\n ac_status=$?\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: \\$? = $ac_status\" >&5\r\n test $ac_status = 0; }; then\r\n pkg_cv_SYSTEM_ZIP_LIBS=`$PKG_CONFIG --libs \"minizip\" 2>/dev/null`\r\n\t\t test \"x$?\" != \"x0\" && pkg_failed=yes\r\nelse\r\n pkg_failed=yes\r\nfi\r\n else\r\n pkg_failed=untried\r\nfi\r\n\r\n\r\n\r\nif test $pkg_failed = yes; then\r\n \t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\r\nif $PKG_CONFIG --atleast-pkgconfig-version 0.20; then\r\n _pkg_short_errors_supported=yes\r\nelse\r\n _pkg_short_errors_supported=no\r\nfi\r\n if test $_pkg_short_errors_supported = yes; then\r\n\t SYSTEM_ZIP_PKG_ERRORS=`$PKG_CONFIG --short-errors --print-errors --cflags --libs \"minizip\" 2>&1`\r\n else\r\n\t SYSTEM_ZIP_PKG_ERRORS=`$PKG_CONFIG --print-errors --cflags --libs \"minizip\" 2>&1`\r\n fi\r\n\t# Put the nasty error message in config.log where it belongs\r\n\techo \"$SYSTEM_ZIP_PKG_ERRORS\" >&5\r\n\r\n\tif test \"x${with_system_zip}\" != \"xcheck\"; then\r\n\t\t\t\tas_fn_error $? \"--with-system-zip requested but no proper minizip lib found.\" \"$LINENO\" 5\r\n\t\t\telse\r\n\t\t\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: minizip not found. Build without SYSTEM_ZIP support.\" >&5\r\n$as_echo \"$as_me: WARNING: minizip not found. Build without SYSTEM_ZIP support.\" >&2;}\r\n\t\t\tfi\r\n\r\nelif test $pkg_failed = untried; then\r\n \t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\tif test \"x${with_system_zip}\" != \"xcheck\"; then\r\n\t\t\t\tas_fn_error $? \"--with-system-zip requested but no proper minizip lib found.\" \"$LINENO\" 5\r\n\t\t\telse\r\n\t\t\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: minizip not found. Build without SYSTEM_ZIP support.\" >&5\r\n$as_echo \"$as_me: WARNING: minizip not found. 
Build without SYSTEM_ZIP support.\" >&2;}\r\n\t\t\tfi\r\n\r\nelse\r\n\tSYSTEM_ZIP_CFLAGS=$pkg_cv_SYSTEM_ZIP_CFLAGS\r\n\tSYSTEM_ZIP_LIBS=$pkg_cv_SYSTEM_ZIP_LIBS\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\tS9XZIP=\"S9XZIP=1\"\r\n\t\t\tS9XDEFS=\"$S9XDEFS -DUNZIP_SUPPORT\"\r\n\t\t\tS9X_SYSTEM_ZIP=\"SYSTEM_ZIP=1\"\r\n\t\t\tS9XLIBS=\"$S9XLIBS $SYSTEM_ZIP_LIBS\"\r\n\t\t\tif test \"x$enable_gzip\" = \"xno\"; then\r\n\t\t\t\tS9XLIBS=\"$S9XLIBS -lz\"\r\n\t\t\tfi\r\n\t\t\tS9XDEFS=\"$S9XDEFS -DSYSTEM_ZIP\"\r\nfi\r\n\telse\r\n\t\tif test \"x$snes9x_cv_zlib\" = \"xyes\"; then\r\n\t\t\tS9XZIP=\"S9XZIP=1\"\r\n\t\t\tS9XDEFS=\"$S9XDEFS -DUNZIP_SUPPORT\"\r\n\t\t\tif test \"x$enable_gzip\" = \"xno\"; then\r\n\t\t\t\tS9XLIBS=\"$S9XLIBS -lz\"\r\n\t\t\tfi\r\n\t\telse\r\n\t\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: zlib not found. Build without ZIP support.\" >&5\r\n$as_echo \"$as_me: WARNING: zlib not found. Build without ZIP support.\" >&2;}\r\n\t\t\tenable_zip=\"no\"\r\n\t\tfi\r\n\tfi\r\nfi\r\n\r\n# Enable JMA support.\r\n\r\nS9XJMA=\"#S9XJMA=1\"\r\n\r\n# Check whether --enable-jma was given.\r\nif test \"${enable_jma+set}\" = set; then :\r\n enableval=$enable_jma;\r\nelse\r\n enable_jma=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_jma\" = \"xyes\"; then\r\n\tS9XJMA=\"S9XJMA=1\"\r\n\tS9XDEFS=\"$S9XDEFS -DJMA_SUPPORT\"\r\nfi\r\n\r\n# Enable screenshot support through libpng.\r\n\r\nif ${snes9x_cv_libpng+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n\r\n\tac_fn_cxx_check_header_mongrel \"$LINENO\" \"png.h\" \"ac_cv_header_png_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_png_h\" = xyes; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for png_init_io in -lpng\" >&5\r\n$as_echo_n \"checking for png_init_io in -lpng... \" >&6; }\r\nif ${ac_cv_lib_png_png_init_io+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lpng $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar png_init_io ();\r\nint\r\nmain ()\r\n{\r\nreturn png_init_io ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_png_png_init_io=yes\r\nelse\r\n ac_cv_lib_png_png_init_io=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_png_png_init_io\" >&5\r\n$as_echo \"$ac_cv_lib_png_png_init_io\" >&6; }\r\nif test \"x$ac_cv_lib_png_png_init_io\" = xyes; then :\r\n snes9x_cv_libpng=\"yes\"\r\nelse\r\n snes9x_cv_libpng=\"no\"\r\nfi\r\n\r\nelse\r\n snes9x_cv_libpng=\"no\"\r\nfi\r\n\r\n\r\n\r\nfi\r\n\r\n\r\n# Check whether --enable-screenshot was given.\r\nif test \"${enable_screenshot+set}\" = set; then :\r\n enableval=$enable_screenshot;\r\nelse\r\n enable_screenshot=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_screenshot\" = \"xyes\"; then\r\n\tif test \"x$snes9x_cv_libpng\" = \"xyes\"; then\r\n\t\tS9XDEFS=\"$S9XDEFS -DHAVE_LIBPNG\"\r\n\t\tS9XLIBS=\"$S9XLIBS -lpng\"\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: libpng not found. Build without screenshot support.\" >&5\r\n$as_echo \"$as_me: WARNING: libpng not found. 
Build without screenshot support.\" >&2;}\r\n\t\tenable_screenshot=\"no\"\r\n\tfi\r\nfi\r\n\r\n# Check for functions\r\n\r\nac_fn_cxx_check_func \"$LINENO\" \"mkstemp\" \"ac_cv_func_mkstemp\"\r\nif test \"x$ac_cv_func_mkstemp\" = xyes; then :\r\n\r\n\tS9XDEFS=\"$S9XDEFS -DHAVE_MKSTEMP\"\r\n\r\nfi\r\n\r\n\r\n# Check X11\r\n\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for X\" >&5\r\n$as_echo_n \"checking for X... \" >&6; }\r\n\r\n\r\n# Check whether --with-x was given.\r\nif test \"${with_x+set}\" = set; then :\r\n withval=$with_x;\r\nfi\r\n\r\n# $have_x is `yes', `no', `disabled', or empty when we do not yet know.\r\nif test \"x$with_x\" = xno; then\r\n # The user explicitly disabled X.\r\n have_x=disabled\r\nelse\r\n case $x_includes,$x_libraries in #(\r\n *\\'*) as_fn_error $? \"cannot use X directory names containing '\" \"$LINENO\" 5;; #(\r\n *,NONE | NONE,*) if ${ac_cv_have_x+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n # One or both of the vars are not set, and there is no cached value.\r\nac_x_includes=no ac_x_libraries=no\r\nrm -f -r conftest.dir\r\nif mkdir conftest.dir; then\r\n cd conftest.dir\r\n cat >Imakefile <<'_ACEOF'\r\nincroot:\r\n\t@echo incroot='${INCROOT}'\r\nusrlibdir:\r\n\t@echo usrlibdir='${USRLIBDIR}'\r\nlibdir:\r\n\t@echo libdir='${LIBDIR}'\r\n_ACEOF\r\n if (export CC; ${XMKMF-xmkmf}) >/dev/null 2>/dev/null && test -f Makefile; then\r\n # GNU make sometimes prints \"make[1]: Entering ...\", which would confuse us.\r\n for ac_var in incroot usrlibdir libdir; do\r\n eval \"ac_im_$ac_var=\\`\\${MAKE-make} $ac_var 2>/dev/null | sed -n 's/^$ac_var=//p'\\`\"\r\n done\r\n # Open Windows xmkmf reportedly sets LIBDIR instead of USRLIBDIR.\r\n for ac_extension in a so sl dylib la dll; do\r\n if test ! -f \"$ac_im_usrlibdir/libX11.$ac_extension\" &&\r\n\t test -f \"$ac_im_libdir/libX11.$ac_extension\"; then\r\n\tac_im_usrlibdir=$ac_im_libdir; break\r\n fi\r\n done\r\n # Screen out bogus values from the imake configuration. 
They are\r\n # bogus both because they are the default anyway, and because\r\n # using them would break gcc on systems where it needs fixed includes.\r\n case $ac_im_incroot in\r\n\t/usr/include) ac_x_includes= ;;\r\n\t*) test -f \"$ac_im_incroot/X11/Xos.h\" && ac_x_includes=$ac_im_incroot;;\r\n esac\r\n case $ac_im_usrlibdir in\r\n\t/usr/lib | /usr/lib64 | /lib | /lib64) ;;\r\n\t*) test -d \"$ac_im_usrlibdir\" && ac_x_libraries=$ac_im_usrlibdir ;;\r\n esac\r\n fi\r\n cd ..\r\n rm -f -r conftest.dir\r\nfi\r\n\r\n# Standard set of common directories for X headers.\r\n# Check X11 before X11Rn because it is often a symlink to the current release.\r\nac_x_header_dirs='\r\n/usr/X11/include\r\n/usr/X11R7/include\r\n/usr/X11R6/include\r\n/usr/X11R5/include\r\n/usr/X11R4/include\r\n\r\n/usr/include/X11\r\n/usr/include/X11R7\r\n/usr/include/X11R6\r\n/usr/include/X11R5\r\n/usr/include/X11R4\r\n\r\n/usr/local/X11/include\r\n/usr/local/X11R7/include\r\n/usr/local/X11R6/include\r\n/usr/local/X11R5/include\r\n/usr/local/X11R4/include\r\n\r\n/usr/local/include/X11\r\n/usr/local/include/X11R7\r\n/usr/local/include/X11R6\r\n/usr/local/include/X11R5\r\n/usr/local/include/X11R4\r\n\r\n/usr/X386/include\r\n/usr/x386/include\r\n/usr/XFree86/include/X11\r\n\r\n/usr/include\r\n/usr/local/include\r\n/usr/unsupported/include\r\n/usr/athena/include\r\n/usr/local/x11r5/include\r\n/usr/lpp/Xamples/include\r\n\r\n/usr/openwin/include\r\n/usr/openwin/share/include'\r\n\r\nif test \"$ac_x_includes\" = no; then\r\n # Guess where to find include files, by looking for Xlib.h.\r\n # First, try using that file with no special directory specified.\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n#include <X11/Xlib.h>\r\n_ACEOF\r\nif ac_fn_cxx_try_cpp \"$LINENO\"; then :\r\n # We can compile using X headers with no special include directory.\r\nac_x_includes=\r\nelse\r\n for ac_dir in $ac_x_header_dirs; do\r\n if test -r \"$ac_dir/X11/Xlib.h\"; then\r\n ac_x_includes=$ac_dir\r\n break\r\n fi\r\ndone\r\nfi\r\nrm -f conftest.err conftest.i conftest.$ac_ext\r\nfi # $ac_x_includes = no\r\n\r\nif test \"$ac_x_libraries\" = no; then\r\n # Check for the libraries.\r\n # See if we find them without any special options.\r\n # Don't add to $LIBS permanently.\r\n ac_save_LIBS=$LIBS\r\n LIBS=\"-lX11 $LIBS\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n#include <X11/Xlib.h>\r\nint\r\nmain ()\r\n{\r\nXrmInitialize ()\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n LIBS=$ac_save_LIBS\r\n# We can link X programs with no special library path.\r\nac_x_libraries=\r\nelse\r\n LIBS=$ac_save_LIBS\r\nfor ac_dir in `$as_echo \"$ac_x_includes $ac_x_header_dirs\" | sed s/include/lib/g`\r\ndo\r\n # Don't even attempt the hair of trying to link an X program!\r\n for ac_extension in a so sl dylib la dll; do\r\n if test -r \"$ac_dir/libX11.$ac_extension\"; then\r\n ac_x_libraries=$ac_dir\r\n break 2\r\n fi\r\n done\r\ndone\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nfi # $ac_x_libraries = no\r\n\r\ncase $ac_x_includes,$ac_x_libraries in #(\r\n no,* | *,no | *\\'*)\r\n # Didn't find X, or a directory has \"'\" in its name.\r\n ac_cv_have_x=\"have_x=no\";; #(\r\n *)\r\n # Record where we found X for the cache.\r\n ac_cv_have_x=\"have_x=yes\\\r\n\tac_x_includes='$ac_x_includes'\\\r\n\tac_x_libraries='$ac_x_libraries'\"\r\nesac\r\nfi\r\n;; #(\r\n *) have_x=yes;;\r\n esac\r\n eval \"$ac_cv_have_x\"\r\nfi # $with_x != no\r\n\r\nif test \"$have_x\" != yes; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: $have_x\" >&5\r\n$as_echo \"$have_x\" >&6; }\r\n no_x=yes\r\nelse\r\n # If each of the values was on the command line, it overrides each guess.\r\n test \"x$x_includes\" = xNONE && x_includes=$ac_x_includes\r\n test \"x$x_libraries\" = xNONE && x_libraries=$ac_x_libraries\r\n # Update the cache value to reflect the command line values.\r\n ac_cv_have_x=\"have_x=yes\\\r\n\tac_x_includes='$x_includes'\\\r\n\tac_x_libraries='$x_libraries'\"\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: libraries $x_libraries, headers $x_includes\" >&5\r\n$as_echo \"libraries $x_libraries, headers $x_includes\" >&6; }\r\nfi\r\n\r\nif test \"$no_x\" = yes; then\r\n # Not all programs may use this symbol, but it does not hurt to define it.\r\n\r\n$as_echo \"#define X_DISPLAY_MISSING 1\" >>confdefs.h\r\n\r\n X_CFLAGS= X_PRE_LIBS= X_LIBS= X_EXTRA_LIBS=\r\nelse\r\n if test -n \"$x_includes\"; then\r\n X_CFLAGS=\"$X_CFLAGS -I$x_includes\"\r\n fi\r\n\r\n # It would also be nice to do this for all -L options, not just this one.\r\n if test -n \"$x_libraries\"; then\r\n X_LIBS=\"$X_LIBS -L$x_libraries\"\r\n # For Solaris; some versions of Sun CC require a space after -R and\r\n # others require no space. Words are not sufficient . . . .\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether -R must be followed by a space\" >&5\r\n$as_echo_n \"checking whether -R must be followed by a space... \" >&6; }\r\n ac_xsave_LIBS=$LIBS; LIBS=\"$LIBS -R$x_libraries\"\r\n ac_xsave_cxx_werror_flag=$ac_cxx_werror_flag\r\n ac_cxx_werror_flag=yes\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n X_LIBS=\"$X_LIBS -R$x_libraries\"\r\nelse\r\n LIBS=\"$ac_xsave_LIBS -R $x_libraries\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\nint\r\nmain ()\r\n{\r\n\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\t X_LIBS=\"$X_LIBS -R $x_libraries\"\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: result: neither works\" >&5\r\n$as_echo \"neither works\" >&6; }\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\n ac_cxx_werror_flag=$ac_xsave_cxx_werror_flag\r\n LIBS=$ac_xsave_LIBS\r\n fi\r\n\r\n # Check for system-dependent libraries X programs must link with.\r\n # Do this before checking for the system-independent R6 libraries\r\n # (-lICE), since we may need -lsocket or whatever for X linking.\r\n\r\n if test \"$ISC\" = yes; then\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -lnsl_s -linet\"\r\n else\r\n # Martyn Johnson says this is needed for Ultrix, if the X\r\n # libraries were built with DECnet support. And Karl Berry says\r\n # the Alpha needs dnet_stub (dnet does not exist).\r\n ac_xsave_LIBS=\"$LIBS\"; LIBS=\"$LIBS $X_LIBS -lX11\"\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar XOpenDisplay ();\r\nint\r\nmain ()\r\n{\r\nreturn XOpenDisplay ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n\r\nelse\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for dnet_ntoa in -ldnet\" >&5\r\n$as_echo_n \"checking for dnet_ntoa in -ldnet... \" >&6; }\r\nif ${ac_cv_lib_dnet_dnet_ntoa+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-ldnet $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar dnet_ntoa ();\r\nint\r\nmain ()\r\n{\r\nreturn dnet_ntoa ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_dnet_dnet_ntoa=yes\r\nelse\r\n ac_cv_lib_dnet_dnet_ntoa=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dnet_dnet_ntoa\" >&5\r\n$as_echo \"$ac_cv_lib_dnet_dnet_ntoa\" >&6; }\r\nif test \"x$ac_cv_lib_dnet_dnet_ntoa\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -ldnet\"\r\nfi\r\n\r\n if test $ac_cv_lib_dnet_dnet_ntoa = no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for dnet_ntoa in -ldnet_stub\" >&5\r\n$as_echo_n \"checking for dnet_ntoa in -ldnet_stub... \" >&6; }\r\nif ${ac_cv_lib_dnet_stub_dnet_ntoa+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-ldnet_stub $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. 
*/\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar dnet_ntoa ();\r\nint\r\nmain ()\r\n{\r\nreturn dnet_ntoa ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_dnet_stub_dnet_ntoa=yes\r\nelse\r\n ac_cv_lib_dnet_stub_dnet_ntoa=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_dnet_stub_dnet_ntoa\" >&5\r\n$as_echo \"$ac_cv_lib_dnet_stub_dnet_ntoa\" >&6; }\r\nif test \"x$ac_cv_lib_dnet_stub_dnet_ntoa\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -ldnet_stub\"\r\nfi\r\n\r\n fi\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\n LIBS=\"$ac_xsave_LIBS\"\r\n\r\n # [email protected] says -lnsl (and -lsocket) are needed for his 386/AT,\r\n # to get the SysV transport functions.\r\n # Chad R. Larson says the Pyramis MIS-ES running DC/OSx (SVR4)\r\n # needs -lnsl.\r\n # The nsl library prevents programs from opening the X display\r\n # on Irix 5.2, according to T.E. Dickey.\r\n # The functions gethostbyname, getservbyname, and inet_addr are\r\n # in -lbsd on LynxOS 3.0.1/i386, according to Lars Hecking.\r\n ac_fn_cxx_check_func \"$LINENO\" \"gethostbyname\" \"ac_cv_func_gethostbyname\"\r\nif test \"x$ac_cv_func_gethostbyname\" = xyes; then :\r\n\r\nfi\r\n\r\n if test $ac_cv_func_gethostbyname = no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for gethostbyname in -lnsl\" >&5\r\n$as_echo_n \"checking for gethostbyname in -lnsl... \" >&6; }\r\nif ${ac_cv_lib_nsl_gethostbyname+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lnsl $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar gethostbyname ();\r\nint\r\nmain ()\r\n{\r\nreturn gethostbyname ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_nsl_gethostbyname=yes\r\nelse\r\n ac_cv_lib_nsl_gethostbyname=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_nsl_gethostbyname\" >&5\r\n$as_echo \"$ac_cv_lib_nsl_gethostbyname\" >&6; }\r\nif test \"x$ac_cv_lib_nsl_gethostbyname\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -lnsl\"\r\nfi\r\n\r\n if test $ac_cv_lib_nsl_gethostbyname = no; then\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking for gethostbyname in -lbsd\" >&5\r\n$as_echo_n \"checking for gethostbyname in -lbsd... \" >&6; }\r\nif ${ac_cv_lib_bsd_gethostbyname+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lbsd $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. 
*/\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar gethostbyname ();\r\nint\r\nmain ()\r\n{\r\nreturn gethostbyname ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_bsd_gethostbyname=yes\r\nelse\r\n ac_cv_lib_bsd_gethostbyname=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_bsd_gethostbyname\" >&5\r\n$as_echo \"$ac_cv_lib_bsd_gethostbyname\" >&6; }\r\nif test \"x$ac_cv_lib_bsd_gethostbyname\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -lbsd\"\r\nfi\r\n\r\n fi\r\n fi\r\n\r\n # [email protected] says without -lsocket,\r\n # socket/setsockopt and other routines are undefined under SCO ODT\r\n # 2.0. But -lsocket is broken on IRIX 5.2 (and is not necessary\r\n # on later versions), says Simon Leinen: it contains gethostby*\r\n # variants that don't use the name server (or something). -lsocket\r\n # must be given before -lnsl if both are needed. We assume that\r\n # if connect needs -lnsl, so does gethostbyname.\r\n ac_fn_cxx_check_func \"$LINENO\" \"connect\" \"ac_cv_func_connect\"\r\nif test \"x$ac_cv_func_connect\" = xyes; then :\r\n\r\nfi\r\n\r\n if test $ac_cv_func_connect = no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for connect in -lsocket\" >&5\r\n$as_echo_n \"checking for connect in -lsocket... \" >&6; }\r\nif ${ac_cv_lib_socket_connect+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lsocket $X_EXTRA_LIBS $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar connect ();\r\nint\r\nmain ()\r\n{\r\nreturn connect ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_socket_connect=yes\r\nelse\r\n ac_cv_lib_socket_connect=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_socket_connect\" >&5\r\n$as_echo \"$ac_cv_lib_socket_connect\" >&6; }\r\nif test \"x$ac_cv_lib_socket_connect\" = xyes; then :\r\n X_EXTRA_LIBS=\"-lsocket $X_EXTRA_LIBS\"\r\nfi\r\n\r\n fi\r\n\r\n # Guillermo Gomez says -lposix is necessary on A/UX.\r\n ac_fn_cxx_check_func \"$LINENO\" \"remove\" \"ac_cv_func_remove\"\r\nif test \"x$ac_cv_func_remove\" = xyes; then :\r\n\r\nfi\r\n\r\n if test $ac_cv_func_remove = no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for remove in -lposix\" >&5\r\n$as_echo_n \"checking for remove in -lposix... \" >&6; }\r\nif ${ac_cv_lib_posix_remove+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lposix $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. 
*/\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar remove ();\r\nint\r\nmain ()\r\n{\r\nreturn remove ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_posix_remove=yes\r\nelse\r\n ac_cv_lib_posix_remove=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_posix_remove\" >&5\r\n$as_echo \"$ac_cv_lib_posix_remove\" >&6; }\r\nif test \"x$ac_cv_lib_posix_remove\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -lposix\"\r\nfi\r\n\r\n fi\r\n\r\n # BSDI BSD/OS 2.1 needs -lipc for XOpenDisplay.\r\n ac_fn_cxx_check_func \"$LINENO\" \"shmat\" \"ac_cv_func_shmat\"\r\nif test \"x$ac_cv_func_shmat\" = xyes; then :\r\n\r\nfi\r\n\r\n if test $ac_cv_func_shmat = no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for shmat in -lipc\" >&5\r\n$as_echo_n \"checking for shmat in -lipc... \" >&6; }\r\nif ${ac_cv_lib_ipc_shmat+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lipc $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. */\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar shmat ();\r\nint\r\nmain ()\r\n{\r\nreturn shmat ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_ipc_shmat=yes\r\nelse\r\n ac_cv_lib_ipc_shmat=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_ipc_shmat\" >&5\r\n$as_echo \"$ac_cv_lib_ipc_shmat\" >&6; }\r\nif test \"x$ac_cv_lib_ipc_shmat\" = xyes; then :\r\n X_EXTRA_LIBS=\"$X_EXTRA_LIBS -lipc\"\r\nfi\r\n\r\n fi\r\n fi\r\n\r\n # Check for libraries that X11R6 Xt/Xaw programs need.\r\n ac_save_LDFLAGS=$LDFLAGS\r\n test -n \"$x_libraries\" && LDFLAGS=\"$LDFLAGS -L$x_libraries\"\r\n # SM needs ICE to (dynamically) link under SunOS 4.x (so we have to\r\n # check for ICE first), but we must link in the order -lSM -lICE or\r\n # we get undefined symbols. So assume we have SM if we have ICE.\r\n # These have to be linked with before -lX11, unlike the other\r\n # libraries we check for below, so use a different variable.\r\n # John Interrante, Karl Berry\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: checking for IceConnectionNumber in -lICE\" >&5\r\n$as_echo_n \"checking for IceConnectionNumber in -lICE... \" >&6; }\r\nif ${ac_cv_lib_ICE_IceConnectionNumber+:} false; then :\r\n $as_echo_n \"(cached) \" >&6\r\nelse\r\n ac_check_lib_save_LIBS=$LIBS\r\nLIBS=\"-lICE $X_EXTRA_LIBS $LIBS\"\r\ncat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n/* Override any GCC internal prototype to avoid an error.\r\n Use char because int might match the return type of a GCC\r\n builtin and then its argument prototype would still apply. 
*/\r\n#ifdef __cplusplus\r\nextern \"C\"\r\n#endif\r\nchar IceConnectionNumber ();\r\nint\r\nmain ()\r\n{\r\nreturn IceConnectionNumber ();\r\n ;\r\n return 0;\r\n}\r\n_ACEOF\r\nif ac_fn_cxx_try_link \"$LINENO\"; then :\r\n ac_cv_lib_ICE_IceConnectionNumber=yes\r\nelse\r\n ac_cv_lib_ICE_IceConnectionNumber=no\r\nfi\r\nrm -f core conftest.err conftest.$ac_objext \\\r\n conftest$ac_exeext conftest.$ac_ext\r\nLIBS=$ac_check_lib_save_LIBS\r\nfi\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: $ac_cv_lib_ICE_IceConnectionNumber\" >&5\r\n$as_echo \"$ac_cv_lib_ICE_IceConnectionNumber\" >&6; }\r\nif test \"x$ac_cv_lib_ICE_IceConnectionNumber\" = xyes; then :\r\n X_PRE_LIBS=\"$X_PRE_LIBS -lSM -lICE\"\r\nfi\r\n\r\n LDFLAGS=$ac_save_LDFLAGS\r\n\r\nfi\r\n\r\nif test \"x$no_x\" = \"xyes\"; then\r\n\tas_fn_error $? \"X11 is required.\" \"$LINENO\" 5\r\nelse\r\n\tS9XFLGS=\"$S9XFLGS $X_CFLAGS\"\r\n\tS9XLIBS=\"$S9XLIBS $X_PRE_LIBS -lX11 -lXext $X_LIBS $X_EXTRA_LIBS\"\r\nfi\r\n\r\n# Check for headers\r\n\r\nsnes9x_have_stdint_h=\"\";\r\n\r\nac_fn_cxx_check_header_mongrel \"$LINENO\" \"strings.h\" \"ac_cv_header_strings_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_strings_h\" = xyes; then :\r\n\r\n\tS9XDEFS=\"$S9XDEFS -DHAVE_STRINGS_H\"\r\n\r\nfi\r\n\r\n\r\n\r\nac_fn_cxx_check_header_mongrel \"$LINENO\" \"sys/ioctl.h\" \"ac_cv_header_sys_ioctl_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_sys_ioctl_h\" = xyes; then :\r\n\r\n\tS9XDEFS=\"$S9XDEFS -DHAVE_SYS_IOCTL_H\"\r\n\r\nfi\r\n\r\n\r\n\r\nac_fn_cxx_check_header_mongrel \"$LINENO\" \"stdint.h\" \"ac_cv_header_stdint_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_stdint_h\" = xyes; then :\r\n\r\n\tS9XDEFS=\"$S9XDEFS -DHAVE_STDINT_H\"\r\n\tsnes9x_have_stdint_h=\"-DHAVE_STDINT_H\"\r\n\r\nfi\r\n\r\n\r\n\r\nfor ac_header in unistd.h sys/socket.h\r\ndo :\r\n as_ac_Header=`$as_echo \"ac_cv_header_$ac_header\" | $as_tr_sh`\r\nac_fn_cxx_check_header_mongrel \"$LINENO\" \"$ac_header\" \"$as_ac_Header\" \"$ac_includes_default\"\r\nif eval test \\\"x\\$\"$as_ac_Header\"\\\" = x\"yes\"; then :\r\n cat >>confdefs.h <<_ACEOF\r\n#define `$as_echo \"HAVE_$ac_header\" | $as_tr_cpp` 1\r\n_ACEOF\r\n\r\nfi\r\n\r\ndone\r\n\r\n\r\n# Check whether the size of pointer is int.\r\n\r\nif test \"x$snes9x_have_stdint_h\" = \"x\"; then\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether the size of pointer is int\" >&5\r\n$as_echo_n \"checking whether the size of pointer is int... \" >&6; }\r\n\r\n\tif test \"$cross_compiling\" = yes; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\n\t\tint main (void)\r\n\t\t{\r\n\t\t\treturn (!(sizeof(void *) == sizeof(int)));\r\n\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_ptr_is_int=\"yes\"\r\nelse\r\n snes9x_ptr_is_int=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\n\tif test \"x$snes9x_ptr_is_int\" = \"xyes\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\t\tS9XDEFS=\"$S9XDEFS -DPTR_NOT_INT\"\r\n\tfi\r\nfi\r\n\r\n# Check whether right shift is arithmetic or not\r\n\r\n\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether right shift int8 is arithmetic\" >&5\r\n$as_echo_n \"checking whether right shift int8 is arithmetic... \" >&6; }\r\n\r\n\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS $snes9x_have_stdint_h\"\r\n\r\n\tif test \"$cross_compiling\" = no; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t#ifdef HAVE_STDINT_H\r\n\t\t#include <stdint.h>\r\n\t\ttypedef int8_t\t\t\tint8;\r\n\t\ttypedef int16_t\t\t\tint16;\r\n\t\ttypedef int32_t\t\t\tint32;\r\n\t\ttypedef int64_t\t\t\tint64;\r\n\t\t#else\r\n\t\ttypedef signed char\t\tint8;\r\n\t\ttypedef signed short\tint16;\r\n\t\ttypedef signed int\t\tint32;\r\n\t\t#ifdef __GNUC__\r\n\t\t__extension__\r\n\t\t#endif\r\n\t\ttypedef long long\t\tint64;\r\n\t\t#endif\r\n\r\n\t\tint main (void)\r\n\t\t{\r\n\t\t\tint8\ti;\r\n\r\n\t\t\ti = -1;\r\n\t\t\ti >>= 1;\r\n\r\n\t\t\treturn (i < 0 ? 0 : 1);\r\n\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_sar_int8=\"yes\"\r\nelse\r\n snes9x_sar_int8=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_sar_int8\" = \"xno\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\telse\r\n\t\tS9XDEFS=\"$S9XDEFS -DRIGHTSHIFT_int8_IS_SAR\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether right shift int16 is arithmetic\" >&5\r\n$as_echo_n \"checking whether right shift int16 is arithmetic... \" >&6; }\r\n\r\n\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS $snes9x_have_stdint_h\"\r\n\r\n\tif test \"$cross_compiling\" = no; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. 
*/\r\n\r\n\t\t#ifdef HAVE_STDINT_H\r\n\t\t#include <stdint.h>\r\n\t\ttypedef int8_t\t\t\tint8;\r\n\t\ttypedef int16_t\t\t\tint16;\r\n\t\ttypedef int32_t\t\t\tint32;\r\n\t\ttypedef int64_t\t\t\tint64;\r\n\t\t#else\r\n\t\ttypedef signed char\t\tint8;\r\n\t\ttypedef signed short\tint16;\r\n\t\ttypedef signed int\t\tint32;\r\n\t\t#ifdef __GNUC__\r\n\t\t__extension__\r\n\t\t#endif\r\n\t\ttypedef long long\t\tint64;\r\n\t\t#endif\r\n\r\n\t\tint main (void)\r\n\t\t{\r\n\t\t\tint16\ti;\r\n\r\n\t\t\ti = -1;\r\n\t\t\ti >>= 1;\r\n\r\n\t\t\treturn (i < 0 ? 0 : 1);\r\n\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_sar_int16=\"yes\"\r\nelse\r\n snes9x_sar_int16=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_sar_int16\" = \"xno\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\telse\r\n\t\tS9XDEFS=\"$S9XDEFS -DRIGHTSHIFT_int16_IS_SAR\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether right shift int32 is arithmetic\" >&5\r\n$as_echo_n \"checking whether right shift int32 is arithmetic... \" >&6; }\r\n\r\n\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS $snes9x_have_stdint_h\"\r\n\r\n\tif test \"$cross_compiling\" = no; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? \"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t#ifdef HAVE_STDINT_H\r\n\t\t#include <stdint.h>\r\n\t\ttypedef int8_t\t\t\tint8;\r\n\t\ttypedef int16_t\t\t\tint16;\r\n\t\ttypedef int32_t\t\t\tint32;\r\n\t\ttypedef int64_t\t\t\tint64;\r\n\t\t#else\r\n\t\ttypedef signed char\t\tint8;\r\n\t\ttypedef signed short\tint16;\r\n\t\ttypedef signed int\t\tint32;\r\n\t\t#ifdef __GNUC__\r\n\t\t__extension__\r\n\t\t#endif\r\n\t\ttypedef long long\t\tint64;\r\n\t\t#endif\r\n\r\n\t\tint main (void)\r\n\t\t{\r\n\t\t\tint32\ti;\r\n\r\n\t\t\ti = -1;\r\n\t\t\ti >>= 1;\r\n\r\n\t\t\treturn (i < 0 ? 0 : 1);\r\n\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_sar_int32=\"yes\"\r\nelse\r\n snes9x_sar_int32=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_sar_int32\" = \"xno\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\telse\r\n\t\tS9XDEFS=\"$S9XDEFS -DRIGHTSHIFT_int32_IS_SAR\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\tfi\r\n\r\n\r\n\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: checking whether right shift int64 is arithmetic\" >&5\r\n$as_echo_n \"checking whether right shift int64 is arithmetic... \" >&6; }\r\n\r\n\tOLD_CXXFLAGS=\"$CXXFLAGS\"\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS $snes9x_have_stdint_h\"\r\n\r\n\tif test \"$cross_compiling\" = no; then :\r\n { { $as_echo \"$as_me:${as_lineno-$LINENO}: error: in \\`$ac_pwd':\" >&5\r\n$as_echo \"$as_me: error: in \\`$ac_pwd':\" >&2;}\r\nas_fn_error $? 
\"cannot run test program while cross compiling\r\nSee \\`config.log' for more details\" \"$LINENO\" 5; }\r\nelse\r\n cat confdefs.h - <<_ACEOF >conftest.$ac_ext\r\n/* end confdefs.h. */\r\n\r\n\t\t#ifdef HAVE_STDINT_H\r\n\t\t#include <stdint.h>\r\n\t\ttypedef int8_t\t\t\tint8;\r\n\t\ttypedef int16_t\t\t\tint16;\r\n\t\ttypedef int32_t\t\t\tint32;\r\n\t\ttypedef int64_t\t\t\tint64;\r\n\t\t#else\r\n\t\ttypedef signed char\t\tint8;\r\n\t\ttypedef signed short\tint16;\r\n\t\ttypedef signed int\t\tint32;\r\n\t\t#ifdef __GNUC__\r\n\t\t__extension__\r\n\t\t#endif\r\n\t\ttypedef long long\t\tint64;\r\n\t\t#endif\r\n\r\n\t\tint main (void)\r\n\t\t{\r\n\t\t\tint64\ti;\r\n\r\n\t\t\ti = -1;\r\n\t\t\ti >>= 1;\r\n\r\n\t\t\treturn (i < 0 ? 0 : 1);\r\n\t\t}\r\n\r\n_ACEOF\r\nif ac_fn_cxx_try_run \"$LINENO\"; then :\r\n snes9x_sar_int64=\"yes\"\r\nelse\r\n snes9x_sar_int64=\"no\"\r\nfi\r\nrm -f core *.core core.conftest.* gmon.out bb.out conftest$ac_exeext \\\r\n conftest.$ac_objext conftest.beam conftest.$ac_ext\r\nfi\r\n\r\n\r\n\tCXXFLAGS=\"$OLD_CXXFLAGS\"\r\n\r\n\tif test \"x$snes9x_sar_int64\" = \"xno\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\telse\r\n\t\tS9XDEFS=\"$S9XDEFS -DRIGHTSHIFT_int64_IS_SAR\"\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\tfi\r\n\r\n\r\nif test \"x$snes9x_sar_int8\" = \"xyes\" -a \"x$snes9x_sar_int16\" = \"xyes\" -a \"x$snes9x_sar_int32\" = \"xyes\" -a \"x$snes9x_sar_int64\" = \"xyes\"; then\r\n\tS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/-DRIGHTSHIFT_int8_IS_SAR//'`\"\r\n\tS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/-DRIGHTSHIFT_int16_IS_SAR//'`\"\r\n\tS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/-DRIGHTSHIFT_int32_IS_SAR//'`\"\r\n\tS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/-DRIGHTSHIFT_int64_IS_SAR//'`\"\r\n\tS9XDEFS=\"$S9XDEFS -DRIGHTSHIFT_IS_SAR\"\r\nfi\r\n\r\n# Check if we can build with Xvideo acceleration support\r\n# Check whether --enable-xvideo was given.\r\nif test \"${enable_xvideo+set}\" = set; then :\r\n enableval=$enable_xvideo;\r\nelse\r\n enable_xvideo=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_xvideo\" = \"xyes\"; then\r\n\tenable_xvideo=\"no\"\r\n\tac_fn_cxx_check_header_mongrel \"$LINENO\" \"X11/extensions/Xv.h\" \"ac_cv_header_X11_extensions_Xv_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_X11_extensions_Xv_h\" = xyes; then :\r\n\r\n\t\tenable_xvideo=\"yes\"\r\n\t\tS9XLIBS=\"$S9XLIBS -lXv\"\r\n\t\tS9XDEFS=\"$S9XDEFS -DUSE_XVIDEO\"\r\n\r\nfi\r\n\r\n\r\nfi\r\n\r\n# Check if we can build with Xinerama multi-monitor support\r\n# Check whether --enable-xinerama was given.\r\nif test \"${enable_xinerama+set}\" = set; then :\r\n enableval=$enable_xinerama;\r\nelse\r\n enable_xinerama=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_xinerama\" = \"xyes\"; then\r\n\tenable_xinerama=\"no\"\r\n\tac_fn_cxx_check_header_mongrel \"$LINENO\" \"X11/extensions/Xinerama.h\" \"ac_cv_header_X11_extensions_Xinerama_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_X11_extensions_Xinerama_h\" = xyes; then :\r\n\r\n\t\tenable_xinerama=\"yes\"\r\n\t\tS9XLIBS=\"$S9XLIBS -lXinerama\"\r\n\t\tS9XDEFS=\"$S9XDEFS -DUSE_XINERAMA\"\r\n\r\nfi\r\n\r\n\r\nfi\r\n\r\n# Check if we have sound code for this platform.\r\n\r\n# Check whether --enable-sound was given.\r\nif test \"${enable_sound+set}\" = set; then :\r\n enableval=$enable_sound;\r\nelse\r\n enable_sound=\"yes\"\r\nfi\r\n\r\n\r\nif test \"x$enable_sound\" = \"xyes\"; then\r\n\t{ $as_echo 
\"$as_me:${as_lineno-$LINENO}: checking whether sound is supported on this platform\" >&5\r\n$as_echo_n \"checking whether sound is supported on this platform... \" >&6; }\r\n\tif test \"x$snes9x_cv_linux_os\" = \"xyes\"; then\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: yes\" >&5\r\n$as_echo \"yes\" >&6; }\r\n\telse\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: result: no\" >&5\r\n$as_echo \"no\" >&6; }\r\n\t\t{ $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: Your OS is not Linux. Build without sound support.\" >&5\r\n$as_echo \"$as_me: WARNING: Your OS is not Linux. Build without sound support.\" >&2;}\r\n\t\tenable_sound=\"no\"\r\n\tfi\r\nfi\r\n\r\nif test \"x$enable_sound\" = \"xyes\"; then\r\n\tac_fn_cxx_check_header_mongrel \"$LINENO\" \"pthread.h\" \"ac_cv_header_pthread_h\" \"$ac_includes_default\"\r\nif test \"x$ac_cv_header_pthread_h\" = xyes; then :\r\n\r\n\t\tS9XDEFS=\"$S9XDEFS -DUSE_THREADS\"\r\n\t\tS9XLIBS=\"$S9XLIBS -lpthread\"\r\n\r\nfi\r\n\r\n\r\nelse\r\n\tS9XDEFS=\"$S9XDEFS -DNOSOUND\"\r\nfi\r\n\r\n# Output.\r\n\r\nS9XFLGS=\"$CXXFLAGS $CPPFLAGS $LDFLAGS $S9XFLGS\"\r\nS9XLIBS=\"$LIBS $S9XLIBS\"\r\n\r\nS9XFLGS=\"`echo \\\"$S9XFLGS\\\" | sed -e 's/ */ /g'`\"\r\nS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/ */ /g'`\"\r\nS9XLIBS=\"`echo \\\"$S9XLIBS\\\" | sed -e 's/ */ /g'`\"\r\nS9X_SYSTEM_ZIP=\"`echo \\\"$S9X_SYSTEM_ZIP\\\" | sed -e 's/ */ /g'`\"\r\nS9XFLGS=\"`echo \\\"$S9XFLGS\\\" | sed -e 's/^ *//'`\"\r\nS9XDEFS=\"`echo \\\"$S9XDEFS\\\" | sed -e 's/^ *//'`\"\r\nS9XLIBS=\"`echo \\\"$S9XLIBS\\\" | sed -e 's/^ *//'`\"\r\nS9X_SYSTEM_ZIP=\"`echo \\\"$S9X_SYSTEM_ZIP\\\" | sed -e 's/^ *//'`\"\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\n\r\nrm config.info 2>/dev/null\r\n\r\ncat >config.info <<EOF\r\n\r\nbuild information:\r\ncc...............,,,. $CC\r\nc++.................. $CXX\r\noptions.............. $S9XFLGS\r\ndefines.............. $S9XDEFS\r\nlibs................. $S9XLIBS\r\n\r\nfeatures:\r\nXvideo support....... $enable_xvideo\r\nXinerama support..... $enable_xinerama\r\nsound support........ $enable_sound\r\nscreenshot support... $enable_screenshot\r\nnetplay support...... $enable_netplay\r\ngamepad support...... $enable_gamepad\r\nGZIP support......... $enable_gzip\r\nZIP support.......... $enable_zip\r\nSYSTEM_ZIP........... $with_system_zip\r\nJMA support.......... $enable_jma\r\nSSE4.1............... $enable_sse41\r\nAVX2................. $enable_avx2\r\nNEON................. $enable_neon\r\ndebugger............. $enable_debugger\r\n\r\nEOF\r\n\r\ncat config.info\r\n\r\nac_config_files=\"$ac_config_files Makefile\"\r\n\r\ncat >confcache <<\\_ACEOF\r\n# This file is a shell script that caches the results of configure\r\n# tests run on this system so they can be shared between configure\r\n# scripts and configure runs, see configure's option --config-cache.\r\n# It is not useful on other systems. 
If it contains results you don't\r\n# want to keep, you may remove or edit it.\r\n#\r\n# config.status only pays attention to the cache file if you give it\r\n# the --recheck option to rerun configure.\r\n#\r\n# `ac_cv_env_foo' variables (set or unset) will be overridden when\r\n# loading this file, other *unset* `ac_cv_foo' will be assigned the\r\n# following values.\r\n\r\n_ACEOF\r\n\r\n# The following way of writing the cache mishandles newlines in values,\r\n# but we know of no workaround that is simple, portable, and efficient.\r\n# So, we kill variables containing newlines.\r\n# Ultrix sh set writes to stderr and can't be redirected directly,\r\n# and sets the high bit in the cache file unless we assign to the vars.\r\n(\r\n for ac_var in `(set) 2>&1 | sed -n 's/^\\([a-zA-Z_][a-zA-Z0-9_]*\\)=.*/\\1/p'`; do\r\n eval ac_val=\\$$ac_var\r\n case $ac_val in #(\r\n *${as_nl}*)\r\n case $ac_var in #(\r\n *_cv_*) { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: cache variable $ac_var contains a newline\" >&5\r\n$as_echo \"$as_me: WARNING: cache variable $ac_var contains a newline\" >&2;} ;;\r\n esac\r\n case $ac_var in #(\r\n _ | IFS | as_nl) ;; #(\r\n BASH_ARGV | BASH_SOURCE) eval $ac_var= ;; #(\r\n *) { eval $ac_var=; unset $ac_var;} ;;\r\n esac ;;\r\n esac\r\n done\r\n\r\n (set) 2>&1 |\r\n case $as_nl`(ac_space=' '; set) 2>&1` in #(\r\n *${as_nl}ac_space=\\ *)\r\n # `set' does not quote correctly, so add quotes: double-quote\r\n # substitution turns \\\\\\\\ into \\\\, and sed turns \\\\ into \\.\r\n sed -n \\\r\n\t\"s/'/'\\\\\\\\''/g;\r\n\t s/^\\\\([_$as_cr_alnum]*_cv_[_$as_cr_alnum]*\\\\)=\\\\(.*\\\\)/\\\\1='\\\\2'/p\"\r\n ;; #(\r\n *)\r\n # `set' quotes correctly as required by POSIX, so do not add quotes.\r\n sed -n \"/^[_$as_cr_alnum]*_cv_[_$as_cr_alnum]*=/p\"\r\n ;;\r\n esac |\r\n sort\r\n) |\r\n sed '\r\n /^ac_cv_env_/b end\r\n t clear\r\n :clear\r\n s/^\\([^=]*\\)=\\(.*[{}].*\\)$/test \"${\\1+set}\" = set || &/\r\n t end\r\n s/^\\([^=]*\\)=\\(.*\\)$/\\1=${\\1=\\2}/\r\n :end' >>confcache\r\nif diff \"$cache_file\" confcache >/dev/null 2>&1; then :; else\r\n if test -w \"$cache_file\"; then\r\n if test \"x$cache_file\" != \"x/dev/null\"; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: updating cache $cache_file\" >&5\r\n$as_echo \"$as_me: updating cache $cache_file\" >&6;}\r\n if test ! -f \"$cache_file\" || test -h \"$cache_file\"; then\r\n\tcat confcache >\"$cache_file\"\r\n else\r\n case $cache_file in #(\r\n */* | ?:*)\r\n\t mv -f confcache \"$cache_file\"$$ &&\r\n\t mv -f \"$cache_file\"$$ \"$cache_file\" ;; #(\r\n *)\r\n\t mv -f confcache \"$cache_file\" ;;\r\n\tesac\r\n fi\r\n fi\r\n else\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: not updating unwritable cache $cache_file\" >&5\r\n$as_echo \"$as_me: not updating unwritable cache $cache_file\" >&6;}\r\n fi\r\nfi\r\nrm -f confcache\r\n\r\ntest \"x$prefix\" = xNONE && prefix=$ac_default_prefix\r\n# Let make expand exec_prefix.\r\ntest \"x$exec_prefix\" = xNONE && exec_prefix='${prefix}'\r\n\r\n# Transform confdefs.h into DEFS.\r\n# Protect against shell expansion while executing Makefile rules.\r\n# Protect against Makefile macro expansion.\r\n#\r\n# If the first sed substitution is executed (which looks for macros that\r\n# take arguments), then branch to the quote section. 
Otherwise,\r\n# look for a macro that doesn't take arguments.\r\nac_script='\r\n:mline\r\n/\\\\$/{\r\n N\r\n s,\\\\\\n,,\r\n b mline\r\n}\r\nt clear\r\n:clear\r\ns/^[\t ]*#[\t ]*define[\t ][\t ]*\\([^\t (][^\t (]*([^)]*)\\)[\t ]*\\(.*\\)/-D\\1=\\2/g\r\nt quote\r\ns/^[\t ]*#[\t ]*define[\t ][\t ]*\\([^\t ][^\t ]*\\)[\t ]*\\(.*\\)/-D\\1=\\2/g\r\nt quote\r\nb any\r\n:quote\r\ns/[\t `~#$^&*(){}\\\\|;'\\''\"<>?]/\\\\&/g\r\ns/\\[/\\\\&/g\r\ns/\\]/\\\\&/g\r\ns/\\$/$$/g\r\nH\r\n:any\r\n${\r\n\tg\r\n\ts/^\\n//\r\n\ts/\\n/ /g\r\n\tp\r\n}\r\n'\r\nDEFS=`sed -n \"$ac_script\" confdefs.h`\r\n\r\n\r\nac_libobjs=\r\nac_ltlibobjs=\r\nU=\r\nfor ac_i in : $LIBOBJS; do test \"x$ac_i\" = x: && continue\r\n # 1. Remove the extension, and $U if already installed.\r\n ac_script='s/\\$U\\././;s/\\.o$//;s/\\.obj$//'\r\n ac_i=`$as_echo \"$ac_i\" | sed \"$ac_script\"`\r\n # 2. Prepend LIBOBJDIR. When used with automake>=1.10 LIBOBJDIR\r\n # will be set to the directory where LIBOBJS objects are built.\r\n as_fn_append ac_libobjs \" \\${LIBOBJDIR}$ac_i\\$U.$ac_objext\"\r\n as_fn_append ac_ltlibobjs \" \\${LIBOBJDIR}$ac_i\"'$U.lo'\r\ndone\r\nLIBOBJS=$ac_libobjs\r\n\r\nLTLIBOBJS=$ac_ltlibobjs\r\n\r\n\r\n\r\n: \"${CONFIG_STATUS=./config.status}\"\r\nac_write_fail=0\r\nac_clean_files_save=$ac_clean_files\r\nac_clean_files=\"$ac_clean_files $CONFIG_STATUS\"\r\n{ $as_echo \"$as_me:${as_lineno-$LINENO}: creating $CONFIG_STATUS\" >&5\r\n$as_echo \"$as_me: creating $CONFIG_STATUS\" >&6;}\r\nas_write_fail=0\r\ncat >$CONFIG_STATUS <<_ASEOF || as_write_fail=1\r\n#! $SHELL\r\n# Generated by $as_me.\r\n# Run this file to recreate the current configuration.\r\n# Compiler output produced by configure, useful for debugging\r\n# configure, is in config.log if it exists.\r\n\r\ndebug=false\r\nac_cs_recheck=false\r\nac_cs_silent=false\r\n\r\nSHELL=\\${CONFIG_SHELL-$SHELL}\r\nexport SHELL\r\n_ASEOF\r\ncat >>$CONFIG_STATUS <<\\_ASEOF || as_write_fail=1\r\n## -------------------- ##\r\n## M4sh Initialization. ##\r\n## -------------------- ##\r\n\r\n# Be more Bourne compatible\r\nDUALCASE=1; export DUALCASE # for MKS sh\r\nif test -n \"${ZSH_VERSION+set}\" && (emulate sh) >/dev/null 2>&1; then :\r\n emulate sh\r\n NULLCMD=:\r\n # Pre-4.2 versions of Zsh do word splitting on ${1+\"$@\"}, which\r\n # is contrary to our usage. 
Disable this feature.\r\n alias -g '${1+\"$@\"}'='\"$@\"'\r\n setopt NO_GLOB_SUBST\r\nelse\r\n case `(set -o) 2>/dev/null` in #(\r\n *posix*) :\r\n set -o posix ;; #(\r\n *) :\r\n ;;\r\nesac\r\nfi\r\n\r\n\r\nas_nl='\r\n'\r\nexport as_nl\r\n# Printing a long string crashes Solaris 7 /usr/bin/printf.\r\nas_echo='\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\\'\r\nas_echo=$as_echo$as_echo$as_echo$as_echo$as_echo\r\nas_echo=$as_echo$as_echo$as_echo$as_echo$as_echo$as_echo\r\n# Prefer a ksh shell builtin over an external printf program on Solaris,\r\n# but without wasting forks for bash or zsh.\r\nif test -z \"$BASH_VERSION$ZSH_VERSION\" \\\r\n && (test \"X`print -r -- $as_echo`\" = \"X$as_echo\") 2>/dev/null; then\r\n as_echo='print -r --'\r\n as_echo_n='print -rn --'\r\nelif (test \"X`printf %s $as_echo`\" = \"X$as_echo\") 2>/dev/null; then\r\n as_echo='printf %s\\n'\r\n as_echo_n='printf %s'\r\nelse\r\n if test \"X`(/usr/ucb/echo -n -n $as_echo) 2>/dev/null`\" = \"X-n $as_echo\"; then\r\n as_echo_body='eval /usr/ucb/echo -n \"$1$as_nl\"'\r\n as_echo_n='/usr/ucb/echo -n'\r\n else\r\n as_echo_body='eval expr \"X$1\" : \"X\\\\(.*\\\\)\"'\r\n as_echo_n_body='eval\r\n arg=$1;\r\n case $arg in #(\r\n *\"$as_nl\"*)\r\n\texpr \"X$arg\" : \"X\\\\(.*\\\\)$as_nl\";\r\n\targ=`expr \"X$arg\" : \".*$as_nl\\\\(.*\\\\)\"`;;\r\n esac;\r\n expr \"X$arg\" : \"X\\\\(.*\\\\)\" | tr -d \"$as_nl\"\r\n '\r\n export as_echo_n_body\r\n as_echo_n='sh -c $as_echo_n_body as_echo'\r\n fi\r\n export as_echo_body\r\n as_echo='sh -c $as_echo_body as_echo'\r\nfi\r\n\r\n# The user is always right.\r\nif test \"${PATH_SEPARATOR+set}\" != set; then\r\n PATH_SEPARATOR=:\r\n (PATH='/bin;/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 && {\r\n (PATH='/bin:/bin'; FPATH=$PATH; sh -c :) >/dev/null 2>&1 ||\r\n PATH_SEPARATOR=';'\r\n }\r\nfi\r\n\r\n\r\n# IFS\r\n# We need space, tab and new line, in precisely that order. Quoting is\r\n# there to prevent editors from complaining about space-tab.\r\n# (If _AS_PATH_WALK were called with IFS unset, it would disable word\r\n# splitting by setting IFS to empty value.)\r\nIFS=\" \"\"\t$as_nl\"\r\n\r\n# Find who we are. Look in the path if we contain no directory separator.\r\nas_myself=\r\ncase $0 in #((\r\n *[\\\\/]* ) as_myself=$0 ;;\r\n *) as_save_IFS=$IFS; IFS=$PATH_SEPARATOR\r\nfor as_dir in $PATH\r\ndo\r\n IFS=$as_save_IFS\r\n test -z \"$as_dir\" && as_dir=.\r\n test -r \"$as_dir/$0\" && as_myself=$as_dir/$0 && break\r\n done\r\nIFS=$as_save_IFS\r\n\r\n ;;\r\nesac\r\n# We did not find ourselves, most probably we were run as `sh COMMAND'\r\n# in which case we are not to be found in the path.\r\nif test \"x$as_myself\" = x; then\r\n as_myself=$0\r\nfi\r\nif test ! -f \"$as_myself\"; then\r\n $as_echo \"$as_myself: error: cannot find myself; rerun with an absolute file name\" >&2\r\n exit 1\r\nfi\r\n\r\n# Unset variables that we do not need and which cause bugs (e.g. in\r\n# pre-3.0 UWIN ksh). But do not cause bugs in bash 2.01; the \"|| exit 1\"\r\n# suppresses any \"Segmentation fault\" message there. 
'((' could\r\n# trigger a bug in pdksh 5.2.14.\r\nfor as_var in BASH_ENV ENV MAIL MAILPATH\r\ndo eval test x\\${$as_var+set} = xset \\\r\n && ( (unset $as_var) || exit 1) >/dev/null 2>&1 && unset $as_var || :\r\ndone\r\nPS1='$ '\r\nPS2='> '\r\nPS4='+ '\r\n\r\n# NLS nuisances.\r\nLC_ALL=C\r\nexport LC_ALL\r\nLANGUAGE=C\r\nexport LANGUAGE\r\n\r\n# CDPATH.\r\n(unset CDPATH) >/dev/null 2>&1 && unset CDPATH\r\n\r\n\r\n# as_fn_error STATUS ERROR [LINENO LOG_FD]\r\n# ----------------------------------------\r\n# Output \"`basename $0`: error: ERROR\" to stderr. If LINENO and LOG_FD are\r\n# provided, also output the error to LOG_FD, referencing LINENO. Then exit the\r\n# script with STATUS, using 1 if that was 0.\r\nas_fn_error ()\r\n{\r\n as_status=$1; test $as_status -eq 0 && as_status=1\r\n if test \"$4\"; then\r\n as_lineno=${as_lineno-\"$3\"} as_lineno_stack=as_lineno_stack=$as_lineno_stack\r\n $as_echo \"$as_me:${as_lineno-$LINENO}: error: $2\" >&$4\r\n fi\r\n $as_echo \"$as_me: error: $2\" >&2\r\n as_fn_exit $as_status\r\n} # as_fn_error\r\n\r\n\r\n# as_fn_set_status STATUS\r\n# -----------------------\r\n# Set $? to STATUS, without forking.\r\nas_fn_set_status ()\r\n{\r\n return $1\r\n} # as_fn_set_status\r\n\r\n# as_fn_exit STATUS\r\n# -----------------\r\n# Exit the shell with STATUS, even in a \"trap 0\" or \"set -e\" context.\r\nas_fn_exit ()\r\n{\r\n set +e\r\n as_fn_set_status $1\r\n exit $1\r\n} # as_fn_exit\r\n\r\n# as_fn_unset VAR\r\n# ---------------\r\n# Portably unset VAR.\r\nas_fn_unset ()\r\n{\r\n { eval $1=; unset $1;}\r\n}\r\nas_unset=as_fn_unset\r\n# as_fn_append VAR VALUE\r\n# ----------------------\r\n# Append the text in VALUE to the end of the definition contained in VAR. Take\r\n# advantage of any shell optimizations that allow amortized linear growth over\r\n# repeated appends, instead of the typical quadratic growth present in naive\r\n# implementations.\r\nif (eval \"as_var=1; as_var+=2; test x\\$as_var = x12\") 2>/dev/null; then :\r\n eval 'as_fn_append ()\r\n {\r\n eval $1+=\\$2\r\n }'\r\nelse\r\n as_fn_append ()\r\n {\r\n eval $1=\\$$1\\$2\r\n }\r\nfi # as_fn_append\r\n\r\n# as_fn_arith ARG...\r\n# ------------------\r\n# Perform arithmetic evaluation on the ARGs, and store the result in the\r\n# global $as_val. Take advantage of shells that can avoid forks. The arguments\r\n# must be portable across $(()) and expr.\r\nif (eval \"test \\$(( 1 + 1 )) = 2\") 2>/dev/null; then :\r\n eval 'as_fn_arith ()\r\n {\r\n as_val=$(( $* ))\r\n }'\r\nelse\r\n as_fn_arith ()\r\n {\r\n as_val=`expr \"$@\" || test $? -eq 1`\r\n }\r\nfi # as_fn_arith\r\n\r\n\r\nif expr a : '\\(a\\)' >/dev/null 2>&1 &&\r\n test \"X`expr 00001 : '.*\\(...\\)'`\" = X001; then\r\n as_expr=expr\r\nelse\r\n as_expr=false\r\nfi\r\n\r\nif (basename -- /) >/dev/null 2>&1 && test \"X`basename -- / 2>&1`\" = \"X/\"; then\r\n as_basename=basename\r\nelse\r\n as_basename=false\r\nfi\r\n\r\nif (as_dir=`dirname -- /` && test \"X$as_dir\" = X/) >/dev/null 2>&1; then\r\n as_dirname=dirname\r\nelse\r\n as_dirname=false\r\nfi\r\n\r\nas_me=`$as_basename -- \"$0\" ||\r\n$as_expr X/\"$0\" : '.*/\\([^/][^/]*\\)/*$' \\| \\\r\n\t X\"$0\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$0\" : 'X\\(/\\)' \\| . 
2>/dev/null ||\r\n$as_echo X/\"$0\" |\r\n sed '/^.*\\/\\([^/][^/]*\\)\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\/\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\/\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n\r\n# Avoid depending upon Character Ranges.\r\nas_cr_letters='abcdefghijklmnopqrstuvwxyz'\r\nas_cr_LETTERS='ABCDEFGHIJKLMNOPQRSTUVWXYZ'\r\nas_cr_Letters=$as_cr_letters$as_cr_LETTERS\r\nas_cr_digits='0123456789'\r\nas_cr_alnum=$as_cr_Letters$as_cr_digits\r\n\r\nECHO_C= ECHO_N= ECHO_T=\r\ncase `echo -n x` in #(((((\r\n-n*)\r\n case `echo 'xy\\c'` in\r\n *c*) ECHO_T='\t';;\t# ECHO_T is single tab character.\r\n xy) ECHO_C='\\c';;\r\n *) echo `echo ksh88 bug on AIX 6.1` > /dev/null\r\n ECHO_T='\t';;\r\n esac;;\r\n*)\r\n ECHO_N='-n';;\r\nesac\r\n\r\nrm -f conf$$ conf$$.exe conf$$.file\r\nif test -d conf$$.dir; then\r\n rm -f conf$$.dir/conf$$.file\r\nelse\r\n rm -f conf$$.dir\r\n mkdir conf$$.dir 2>/dev/null\r\nfi\r\nif (echo >conf$$.file) 2>/dev/null; then\r\n if ln -s conf$$.file conf$$ 2>/dev/null; then\r\n as_ln_s='ln -s'\r\n # ... but there are two gotchas:\r\n # 1) On MSYS, both `ln -s file dir' and `ln file dir' fail.\r\n # 2) DJGPP < 2.04 has no symlinks; `ln -s' creates a wrapper executable.\r\n # In both cases, we have to default to `cp -pR'.\r\n ln -s conf$$.file conf$$.dir 2>/dev/null && test ! -f conf$$.exe ||\r\n as_ln_s='cp -pR'\r\n elif ln conf$$.file conf$$ 2>/dev/null; then\r\n as_ln_s=ln\r\n else\r\n as_ln_s='cp -pR'\r\n fi\r\nelse\r\n as_ln_s='cp -pR'\r\nfi\r\nrm -f conf$$ conf$$.exe conf$$.dir/conf$$.file conf$$.file\r\nrmdir conf$$.dir 2>/dev/null\r\n\r\n\r\n# as_fn_mkdir_p\r\n# -------------\r\n# Create \"$as_dir\" as a directory, including parents if necessary.\r\nas_fn_mkdir_p ()\r\n{\r\n\r\n case $as_dir in #(\r\n -*) as_dir=./$as_dir;;\r\n esac\r\n test -d \"$as_dir\" || eval $as_mkdir_p || {\r\n as_dirs=\r\n while :; do\r\n case $as_dir in #(\r\n *\\'*) as_qdir=`$as_echo \"$as_dir\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"`;; #'(\r\n *) as_qdir=$as_dir;;\r\n esac\r\n as_dirs=\"'$as_qdir' $as_dirs\"\r\n as_dir=`$as_dirname -- \"$as_dir\" ||\r\n$as_expr X\"$as_dir\" : 'X\\(.*[^/]\\)//*[^/][^/]*/*$' \\| \\\r\n\t X\"$as_dir\" : 'X\\(//\\)[^/]' \\| \\\r\n\t X\"$as_dir\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$as_dir\" : 'X\\(/\\)' \\| . 2>/dev/null ||\r\n$as_echo X\"$as_dir\" |\r\n sed '/^X\\(.*[^/]\\)\\/\\/*[^/][^/]*\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)[^/].*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n test -d \"$as_dir\" && break\r\n done\r\n test -z \"$as_dirs\" || eval \"mkdir $as_dirs\"\r\n } || test -d \"$as_dir\" || as_fn_error $? \"cannot create directory $as_dir\"\r\n\r\n\r\n} # as_fn_mkdir_p\r\nif mkdir -p . 
2>/dev/null; then\r\n as_mkdir_p='mkdir -p \"$as_dir\"'\r\nelse\r\n test -d ./-p && rmdir ./-p\r\n as_mkdir_p=false\r\nfi\r\n\r\n\r\n# as_fn_executable_p FILE\r\n# -----------------------\r\n# Test if FILE is an executable regular file.\r\nas_fn_executable_p ()\r\n{\r\n test -f \"$1\" && test -x \"$1\"\r\n} # as_fn_executable_p\r\nas_test_x='test -x'\r\nas_executable_p=as_fn_executable_p\r\n\r\n# Sed expression to map a string onto a valid CPP name.\r\nas_tr_cpp=\"eval sed 'y%*$as_cr_letters%P$as_cr_LETTERS%;s%[^_$as_cr_alnum]%_%g'\"\r\n\r\n# Sed expression to map a string onto a valid variable name.\r\nas_tr_sh=\"eval sed 'y%*+%pp%;s%[^_$as_cr_alnum]%_%g'\"\r\n\r\n\r\nexec 6>&1\r\n## ----------------------------------- ##\r\n## Main body of $CONFIG_STATUS script. ##\r\n## ----------------------------------- ##\r\n_ASEOF\r\ntest $as_write_fail = 0 && chmod +x $CONFIG_STATUS || ac_write_fail=1\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\n# Save the log message, to keep $0 and so on meaningful, and to\r\n# report actual input values of CONFIG_FILES etc. instead of their\r\n# values after options handling.\r\nac_log=\"\r\nThis file was extended by Snes9x $as_me 1.60, which was\r\ngenerated by GNU Autoconf 2.69. Invocation command line was\r\n\r\n CONFIG_FILES = $CONFIG_FILES\r\n CONFIG_HEADERS = $CONFIG_HEADERS\r\n CONFIG_LINKS = $CONFIG_LINKS\r\n CONFIG_COMMANDS = $CONFIG_COMMANDS\r\n $ $0 $@\r\n\r\non `(hostname || uname -n) 2>/dev/null | sed 1q`\r\n\"\r\n\r\n_ACEOF\r\n\r\ncase $ac_config_files in *\"\r\n\"*) set x $ac_config_files; shift; ac_config_files=$*;;\r\nesac\r\n\r\n\r\n\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\n# Files that config.status was made for.\r\nconfig_files=\"$ac_config_files\"\r\n\r\n_ACEOF\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\nac_cs_usage=\"\\\r\n\\`$as_me' instantiates files and other configuration actions\r\nfrom templates according to the current configuration. Unless the files\r\nand actions are specified as TAGs, all are instantiated by default.\r\n\r\nUsage: $0 [OPTION]... 
[TAG]...\r\n\r\n -h, --help print this help, then exit\r\n -V, --version print version number and configuration settings, then exit\r\n --config print configuration, then exit\r\n -q, --quiet, --silent\r\n do not print progress messages\r\n -d, --debug don't remove temporary files\r\n --recheck update $as_me by reconfiguring in the same conditions\r\n --file=FILE[:TEMPLATE]\r\n instantiate the configuration file FILE\r\n\r\nConfiguration files:\r\n$config_files\r\n\r\nReport bugs to the package provider.\"\r\n\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\nac_cs_config=\"`$as_echo \"$ac_configure_args\" | sed 's/^ //; s/[\\\\\"\"\\`\\$]/\\\\\\\\&/g'`\"\r\nac_cs_version=\"\\\\\r\nSnes9x config.status 1.60\r\nconfigured by $0, generated by GNU Autoconf 2.69,\r\n with options \\\\\"\\$ac_cs_config\\\\\"\r\n\r\nCopyright (C) 2012 Free Software Foundation, Inc.\r\nThis config.status script is free software; the Free Software Foundation\r\ngives unlimited permission to copy, distribute and modify it.\"\r\n\r\nac_pwd='$ac_pwd'\r\nsrcdir='$srcdir'\r\ntest -n \"\\$AWK\" || AWK=awk\r\n_ACEOF\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\n# The default lists apply if the user does not specify any file.\r\nac_need_defaults=:\r\nwhile test $# != 0\r\ndo\r\n case $1 in\r\n --*=?*)\r\n ac_option=`expr \"X$1\" : 'X\\([^=]*\\)='`\r\n ac_optarg=`expr \"X$1\" : 'X[^=]*=\\(.*\\)'`\r\n ac_shift=:\r\n ;;\r\n --*=)\r\n ac_option=`expr \"X$1\" : 'X\\([^=]*\\)='`\r\n ac_optarg=\r\n ac_shift=:\r\n ;;\r\n *)\r\n ac_option=$1\r\n ac_optarg=$2\r\n ac_shift=shift\r\n ;;\r\n esac\r\n\r\n case $ac_option in\r\n # Handling of the options.\r\n -recheck | --recheck | --rechec | --reche | --rech | --rec | --re | --r)\r\n ac_cs_recheck=: ;;\r\n --version | --versio | --versi | --vers | --ver | --ve | --v | -V )\r\n $as_echo \"$ac_cs_version\"; exit ;;\r\n --config | --confi | --conf | --con | --co | --c )\r\n $as_echo \"$ac_cs_config\"; exit ;;\r\n --debug | --debu | --deb | --de | --d | -d )\r\n debug=: ;;\r\n --file | --fil | --fi | --f )\r\n $ac_shift\r\n case $ac_optarg in\r\n *\\'*) ac_optarg=`$as_echo \"$ac_optarg\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"` ;;\r\n '') as_fn_error $? \"missing file argument\" ;;\r\n esac\r\n as_fn_append CONFIG_FILES \" '$ac_optarg'\"\r\n ac_need_defaults=false;;\r\n --he | --h | --help | --hel | -h )\r\n $as_echo \"$ac_cs_usage\"; exit ;;\r\n -q | -quiet | --quiet | --quie | --qui | --qu | --q \\\r\n | -silent | --silent | --silen | --sile | --sil | --si | --s)\r\n ac_cs_silent=: ;;\r\n\r\n # This is an error.\r\n -*) as_fn_error $? \"unrecognized option: \\`$1'\r\nTry \\`$0 --help' for more information.\" ;;\r\n\r\n *) as_fn_append ac_config_targets \" $1\"\r\n ac_need_defaults=false ;;\r\n\r\n esac\r\n shift\r\ndone\r\n\r\nac_configure_extra_args=\r\n\r\nif $ac_cs_silent; then\r\n exec 6>/dev/null\r\n ac_configure_extra_args=\"$ac_configure_extra_args --silent\"\r\nfi\r\n\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\nif \\$ac_cs_recheck; then\r\n set X $SHELL '$0' $ac_configure_args \\$ac_configure_extra_args --no-create --no-recursion\r\n shift\r\n \\$as_echo \"running CONFIG_SHELL=$SHELL \\$*\" >&6\r\n CONFIG_SHELL='$SHELL'\r\n export CONFIG_SHELL\r\n exec \"\\$@\"\r\nfi\r\n\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\nexec 5>>config.log\r\n{\r\n echo\r\n sed 'h;s/./-/g;s/^.../## /;s/...$/ ##/;p;x;p;x' <<_ASBOX\r\n## Running $as_me. 
##\r\n_ASBOX\r\n $as_echo \"$ac_log\"\r\n} >&5\r\n\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\n_ACEOF\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\n\r\n# Handling of arguments.\r\nfor ac_config_target in $ac_config_targets\r\ndo\r\n case $ac_config_target in\r\n \"Makefile\") CONFIG_FILES=\"$CONFIG_FILES Makefile\" ;;\r\n\r\n *) as_fn_error $? \"invalid argument: \\`$ac_config_target'\" \"$LINENO\" 5;;\r\n esac\r\ndone\r\n\r\n\r\n# If the user did not use the arguments to specify the items to instantiate,\r\n# then the envvar interface is used. Set only those that are not.\r\n# We use the long form for the default assignment because of an extremely\r\n# bizarre bug on SunOS 4.1.3.\r\nif $ac_need_defaults; then\r\n test \"${CONFIG_FILES+set}\" = set || CONFIG_FILES=$config_files\r\nfi\r\n\r\n# Have a temporary directory for convenience. Make it in the build tree\r\n# simply because there is no reason against having it here, and in addition,\r\n# creating and moving files from /tmp can sometimes cause problems.\r\n# Hook for its removal unless debugging.\r\n# Note that there is a small window in which the directory will not be cleaned:\r\n# after its creation but before its name has been assigned to `$tmp'.\r\n$debug ||\r\n{\r\n tmp= ac_tmp=\r\n trap 'exit_status=$?\r\n : \"${ac_tmp:=$tmp}\"\r\n { test ! -d \"$ac_tmp\" || rm -fr \"$ac_tmp\"; } && exit $exit_status\r\n' 0\r\n trap 'as_fn_exit 1' 1 2 13 15\r\n}\r\n# Create a (secure) tmp directory for tmp files.\r\n\r\n{\r\n tmp=`(umask 077 && mktemp -d \"./confXXXXXX\") 2>/dev/null` &&\r\n test -d \"$tmp\"\r\n} ||\r\n{\r\n tmp=./conf$$-$RANDOM\r\n (umask 077 && mkdir \"$tmp\")\r\n} || as_fn_error $? \"cannot create a temporary directory in .\" \"$LINENO\" 5\r\nac_tmp=$tmp\r\n\r\n# Set up the scripts for CONFIG_FILES section.\r\n# No need to generate them if there are no CONFIG_FILES.\r\n# This happens for instance with `./config.status config.h'.\r\nif test -n \"$CONFIG_FILES\"; then\r\n\r\n\r\nac_cr=`echo X | tr X '\\015'`\r\n# On cygwin, bash can eat \\r inside `` if the user requested igncr.\r\n# But we know of no other shell where ac_cr would be empty at this\r\n# point, so we can use a bashism as a fallback.\r\nif test \"x$ac_cr\" = x; then\r\n eval ac_cr=\\$\\'\\\\r\\'\r\nfi\r\nac_cs_awk_cr=`$AWK 'BEGIN { print \"a\\rb\" }' </dev/null 2>/dev/null`\r\nif test \"$ac_cs_awk_cr\" = \"a${ac_cr}b\"; then\r\n ac_cs_awk_cr='\\\\r'\r\nelse\r\n ac_cs_awk_cr=$ac_cr\r\nfi\r\n\r\necho 'BEGIN {' >\"$ac_tmp/subs1.awk\" &&\r\n_ACEOF\r\n\r\n\r\n{\r\n echo \"cat >conf$$subs.awk <<_ACEOF\" &&\r\n echo \"$ac_subst_vars\" | sed 's/.*/&!$&$ac_delim/' &&\r\n echo \"_ACEOF\"\r\n} >conf$$subs.sh ||\r\n as_fn_error $? \"could not make $CONFIG_STATUS\" \"$LINENO\" 5\r\nac_delim_num=`echo \"$ac_subst_vars\" | grep -c '^'`\r\nac_delim='%!_!# '\r\nfor ac_last_try in false false false false false :; do\r\n . ./conf$$subs.sh ||\r\n as_fn_error $? \"could not make $CONFIG_STATUS\" \"$LINENO\" 5\r\n\r\n ac_delim_n=`sed -n \"s/.*$ac_delim\\$/X/p\" conf$$subs.awk | grep -c X`\r\n if test $ac_delim_n = $ac_delim_num; then\r\n break\r\n elif $ac_last_try; then\r\n as_fn_error $? \"could not make $CONFIG_STATUS\" \"$LINENO\" 5\r\n else\r\n ac_delim=\"$ac_delim!$ac_delim _$ac_delim!! 
\"\r\n fi\r\ndone\r\nrm -f conf$$subs.sh\r\n\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\ncat >>\"\\$ac_tmp/subs1.awk\" <<\\\\_ACAWK &&\r\n_ACEOF\r\nsed -n '\r\nh\r\ns/^/S[\"/; s/!.*/\"]=/\r\np\r\ng\r\ns/^[^!]*!//\r\n:repl\r\nt repl\r\ns/'\"$ac_delim\"'$//\r\nt delim\r\n:nl\r\nh\r\ns/\\(.\\{148\\}\\)..*/\\1/\r\nt more1\r\ns/[\"\\\\]/\\\\&/g; s/^/\"/; s/$/\\\\n\"\\\\/\r\np\r\nn\r\nb repl\r\n:more1\r\ns/[\"\\\\]/\\\\&/g; s/^/\"/; s/$/\"\\\\/\r\np\r\ng\r\ns/.\\{148\\}//\r\nt nl\r\n:delim\r\nh\r\ns/\\(.\\{148\\}\\)..*/\\1/\r\nt more2\r\ns/[\"\\\\]/\\\\&/g; s/^/\"/; s/$/\"/\r\np\r\nb\r\n:more2\r\ns/[\"\\\\]/\\\\&/g; s/^/\"/; s/$/\"\\\\/\r\np\r\ng\r\ns/.\\{148\\}//\r\nt delim\r\n' <conf$$subs.awk | sed '\r\n/^[^\"\"]/{\r\n N\r\n s/\\n//\r\n}\r\n' >>$CONFIG_STATUS || ac_write_fail=1\r\nrm -f conf$$subs.awk\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\n_ACAWK\r\ncat >>\"\\$ac_tmp/subs1.awk\" <<_ACAWK &&\r\n for (key in S) S_is_set[key] = 1\r\n FS = \"\u0007\"\r\n\r\n}\r\n{\r\n line = $ 0\r\n nfields = split(line, field, \"@\")\r\n substed = 0\r\n len = length(field[1])\r\n for (i = 2; i < nfields; i++) {\r\n key = field[i]\r\n keylen = length(key)\r\n if (S_is_set[key]) {\r\n value = S[key]\r\n line = substr(line, 1, len) \"\" value \"\" substr(line, len + keylen + 3)\r\n len += length(value) + length(field[++i])\r\n substed = 1\r\n } else\r\n len += 1 + keylen\r\n }\r\n\r\n print line\r\n}\r\n\r\n_ACAWK\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\nif sed \"s/$ac_cr//\" < /dev/null > /dev/null 2>&1; then\r\n sed \"s/$ac_cr\\$//; s/$ac_cr/$ac_cs_awk_cr/g\"\r\nelse\r\n cat\r\nfi < \"$ac_tmp/subs1.awk\" > \"$ac_tmp/subs.awk\" \\\r\n || as_fn_error $? \"could not setup config files machinery\" \"$LINENO\" 5\r\n_ACEOF\r\n\r\n# VPATH may cause trouble with some makes, so we remove sole $(srcdir),\r\n# ${srcdir} and @srcdir@ entries from VPATH if srcdir is \".\", strip leading and\r\n# trailing colons and then remove the whole line if VPATH becomes empty\r\n# (actually we leave an empty line to preserve line numbers).\r\nif test \"x$srcdir\" = x.; then\r\n ac_vpsub='/^[\t ]*VPATH[\t ]*=[\t ]*/{\r\nh\r\ns///\r\ns/^/:/\r\ns/[\t ]*$/:/\r\ns/:\\$(srcdir):/:/g\r\ns/:\\${srcdir}:/:/g\r\ns/:@srcdir@:/:/g\r\ns/^:*//\r\ns/:*$//\r\nx\r\ns/\\(=[\t ]*\\).*/\\1/\r\nG\r\ns/\\n//\r\ns/^[^=]*=[\t ]*$//\r\n}'\r\nfi\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\nfi # test -n \"$CONFIG_FILES\"\r\n\r\n\r\neval set X \" :F $CONFIG_FILES \"\r\nshift\r\nfor ac_tag\r\ndo\r\n case $ac_tag in\r\n :[FHLC]) ac_mode=$ac_tag; continue;;\r\n esac\r\n case $ac_mode$ac_tag in\r\n :[FHL]*:*);;\r\n :L* | :C*:*) as_fn_error $? \"invalid tag \\`$ac_tag'\" \"$LINENO\" 5;;\r\n :[FH]-) ac_tag=-:-;;\r\n :[FH]*) ac_tag=$ac_tag:$ac_tag.in;;\r\n esac\r\n ac_save_IFS=$IFS\r\n IFS=:\r\n set x $ac_tag\r\n IFS=$ac_save_IFS\r\n shift\r\n ac_file=$1\r\n shift\r\n\r\n case $ac_mode in\r\n :L) ac_source=$1;;\r\n :[FH])\r\n ac_file_inputs=\r\n for ac_f\r\n do\r\n case $ac_f in\r\n -) ac_f=\"$ac_tmp/stdin\";;\r\n *) # Look for the file first in the build tree, then in the source tree\r\n\t # (if the path is not absolute). 
The absolute path cannot be DOS-style,\r\n\t # because $ac_f cannot contain `:'.\r\n\t test -f \"$ac_f\" ||\r\n\t case $ac_f in\r\n\t [\\\\/$]*) false;;\r\n\t *) test -f \"$srcdir/$ac_f\" && ac_f=\"$srcdir/$ac_f\";;\r\n\t esac ||\r\n\t as_fn_error 1 \"cannot find input file: \\`$ac_f'\" \"$LINENO\" 5;;\r\n esac\r\n case $ac_f in *\\'*) ac_f=`$as_echo \"$ac_f\" | sed \"s/'/'\\\\\\\\\\\\\\\\''/g\"`;; esac\r\n as_fn_append ac_file_inputs \" '$ac_f'\"\r\n done\r\n\r\n # Let's still pretend it is `configure' which instantiates (i.e., don't\r\n # use $as_me), people would be surprised to read:\r\n # /* config.h. Generated by config.status. */\r\n configure_input='Generated from '`\r\n\t $as_echo \"$*\" | sed 's|^[^:]*/||;s|:[^:]*/|, |g'\r\n\t`' by configure.'\r\n if test x\"$ac_file\" != x-; then\r\n configure_input=\"$ac_file. $configure_input\"\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: creating $ac_file\" >&5\r\n$as_echo \"$as_me: creating $ac_file\" >&6;}\r\n fi\r\n # Neutralize special characters interpreted by sed in replacement strings.\r\n case $configure_input in #(\r\n *\\&* | *\\|* | *\\\\* )\r\n ac_sed_conf_input=`$as_echo \"$configure_input\" |\r\n sed 's/[\\\\\\\\&|]/\\\\\\\\&/g'`;; #(\r\n *) ac_sed_conf_input=$configure_input;;\r\n esac\r\n\r\n case $ac_tag in\r\n *:-:* | *:-) cat >\"$ac_tmp/stdin\" \\\r\n || as_fn_error $? \"could not create $ac_file\" \"$LINENO\" 5 ;;\r\n esac\r\n ;;\r\n esac\r\n\r\n ac_dir=`$as_dirname -- \"$ac_file\" ||\r\n$as_expr X\"$ac_file\" : 'X\\(.*[^/]\\)//*[^/][^/]*/*$' \\| \\\r\n\t X\"$ac_file\" : 'X\\(//\\)[^/]' \\| \\\r\n\t X\"$ac_file\" : 'X\\(//\\)$' \\| \\\r\n\t X\"$ac_file\" : 'X\\(/\\)' \\| . 2>/dev/null ||\r\n$as_echo X\"$ac_file\" |\r\n sed '/^X\\(.*[^/]\\)\\/\\/*[^/][^/]*\\/*$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)[^/].*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\/\\)$/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t /^X\\(\\/\\).*/{\r\n\t s//\\1/\r\n\t q\r\n\t }\r\n\t s/.*/./; q'`\r\n as_dir=\"$ac_dir\"; as_fn_mkdir_p\r\n ac_builddir=.\r\n\r\ncase \"$ac_dir\" in\r\n.) ac_dir_suffix= ac_top_builddir_sub=. ac_top_build_prefix= ;;\r\n*)\r\n ac_dir_suffix=/`$as_echo \"$ac_dir\" | sed 's|^\\.[\\\\/]||'`\r\n # A \"..\" for each directory in $ac_dir_suffix.\r\n ac_top_builddir_sub=`$as_echo \"$ac_dir_suffix\" | sed 's|/[^\\\\/]*|/..|g;s|/||'`\r\n case $ac_top_builddir_sub in\r\n \"\") ac_top_builddir_sub=. ac_top_build_prefix= ;;\r\n *) ac_top_build_prefix=$ac_top_builddir_sub/ ;;\r\n esac ;;\r\nesac\r\nac_abs_top_builddir=$ac_pwd\r\nac_abs_builddir=$ac_pwd$ac_dir_suffix\r\n# for backward compatibility:\r\nac_top_builddir=$ac_top_build_prefix\r\n\r\ncase $srcdir in\r\n .) 
# We are building in place.\r\n ac_srcdir=.\r\n ac_top_srcdir=$ac_top_builddir_sub\r\n ac_abs_top_srcdir=$ac_pwd ;;\r\n [\\\\/]* | ?:[\\\\/]* ) # Absolute name.\r\n ac_srcdir=$srcdir$ac_dir_suffix;\r\n ac_top_srcdir=$srcdir\r\n ac_abs_top_srcdir=$srcdir ;;\r\n *) # Relative name.\r\n ac_srcdir=$ac_top_build_prefix$srcdir$ac_dir_suffix\r\n ac_top_srcdir=$ac_top_build_prefix$srcdir\r\n ac_abs_top_srcdir=$ac_pwd/$srcdir ;;\r\nesac\r\nac_abs_srcdir=$ac_abs_top_srcdir$ac_dir_suffix\r\n\r\n\r\n case $ac_mode in\r\n :F)\r\n #\r\n # CONFIG_FILE\r\n #\r\n\r\n_ACEOF\r\n\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\n# If the template does not know about datarootdir, expand it.\r\n# FIXME: This hack should be removed a few years after 2.60.\r\nac_datarootdir_hack=; ac_datarootdir_seen=\r\nac_sed_dataroot='\r\n/datarootdir/ {\r\n p\r\n q\r\n}\r\n/@datadir@/p\r\n/@docdir@/p\r\n/@infodir@/p\r\n/@localedir@/p\r\n/@mandir@/p'\r\ncase `eval \"sed -n \\\"\\$ac_sed_dataroot\\\" $ac_file_inputs\"` in\r\n*datarootdir*) ac_datarootdir_seen=yes;;\r\n*@datadir@*|*@docdir@*|*@infodir@*|*@localedir@*|*@mandir@*)\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $ac_file_inputs seems to ignore the --datarootdir setting\" >&5\r\n$as_echo \"$as_me: WARNING: $ac_file_inputs seems to ignore the --datarootdir setting\" >&2;}\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\n ac_datarootdir_hack='\r\n s&@datadir@&$datadir&g\r\n s&@docdir@&$docdir&g\r\n s&@infodir@&$infodir&g\r\n s&@localedir@&$localedir&g\r\n s&@mandir@&$mandir&g\r\n s&\\\\\\${datarootdir}&$datarootdir&g' ;;\r\nesac\r\n_ACEOF\r\n\r\n# Neutralize VPATH when `$srcdir' = `.'.\r\n# Shell code in configure.ac might set extrasub.\r\n# FIXME: do we really want to maintain this feature?\r\ncat >>$CONFIG_STATUS <<_ACEOF || ac_write_fail=1\r\nac_sed_extra=\"$ac_vpsub\r\n$extrasub\r\n_ACEOF\r\ncat >>$CONFIG_STATUS <<\\_ACEOF || ac_write_fail=1\r\n:t\r\n/@[a-zA-Z_][a-zA-Z_0-9]*@/!b\r\ns|@configure_input@|$ac_sed_conf_input|;t t\r\ns&@top_builddir@&$ac_top_builddir_sub&;t t\r\ns&@top_build_prefix@&$ac_top_build_prefix&;t t\r\ns&@srcdir@&$ac_srcdir&;t t\r\ns&@abs_srcdir@&$ac_abs_srcdir&;t t\r\ns&@top_srcdir@&$ac_top_srcdir&;t t\r\ns&@abs_top_srcdir@&$ac_abs_top_srcdir&;t t\r\ns&@builddir@&$ac_builddir&;t t\r\ns&@abs_builddir@&$ac_abs_builddir&;t t\r\ns&@abs_top_builddir@&$ac_abs_top_builddir&;t t\r\n$ac_datarootdir_hack\r\n\"\r\neval sed \\\"\\$ac_sed_extra\\\" \"$ac_file_inputs\" | $AWK -f \"$ac_tmp/subs.awk\" \\\r\n >$ac_tmp/out || as_fn_error $? \"could not create $ac_file\" \"$LINENO\" 5\r\n\r\ntest -z \"$ac_datarootdir_hack$ac_datarootdir_seen\" &&\r\n { ac_out=`sed -n '/\\${datarootdir}/p' \"$ac_tmp/out\"`; test -n \"$ac_out\"; } &&\r\n { ac_out=`sed -n '/^[\t ]*datarootdir[\t ]*:*=/p' \\\r\n \"$ac_tmp/out\"`; test -z \"$ac_out\"; } &&\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: $ac_file contains a reference to the variable \\`datarootdir'\r\nwhich seems to be undefined. Please make sure it is defined\" >&5\r\n$as_echo \"$as_me: WARNING: $ac_file contains a reference to the variable \\`datarootdir'\r\nwhich seems to be undefined. Please make sure it is defined\" >&2;}\r\n\r\n rm -f \"$ac_tmp/stdin\"\r\n case $ac_file in\r\n -) cat \"$ac_tmp/out\" && rm -f \"$ac_tmp/out\";;\r\n *) rm -f \"$ac_file\" && mv \"$ac_tmp/out\" \"$ac_file\";;\r\n esac \\\r\n || as_fn_error $? 
\"could not create $ac_file\" \"$LINENO\" 5\r\n ;;\r\n\r\n\r\n\r\n esac\r\n\r\ndone # for ac_tag\r\n\r\n\r\nas_fn_exit 0\r\n_ACEOF\r\nac_clean_files=$ac_clean_files_save\r\n\r\ntest $ac_write_fail = 0 ||\r\n as_fn_error $? \"write failure creating $CONFIG_STATUS\" \"$LINENO\" 5\r\n\r\n\r\n# configure is writing to config.log, and then calls config.status.\r\n# config.status does its own redirection, appending to config.log.\r\n# Unfortunately, on DOS this fails, as config.log is still kept open\r\n# by configure, so config.status won't be able to write to it; its\r\n# output is simply discarded. So we exec the FD to /dev/null,\r\n# effectively closing config.log, so it can be properly (re)opened and\r\n# appended to by config.status. When coming back to configure, we\r\n# need to make the FD available again.\r\nif test \"$no_create\" != yes; then\r\n ac_cs_success=:\r\n ac_config_status_args=\r\n test \"$silent\" = yes &&\r\n ac_config_status_args=\"$ac_config_status_args --quiet\"\r\n exec 5>/dev/null\r\n $SHELL $CONFIG_STATUS $ac_config_status_args || ac_cs_success=false\r\n exec 5>>config.log\r\n # Use ||, not &&, to avoid exiting from the if with $? = 1, which\r\n # would make configure fail if this is the last instruction.\r\n $ac_cs_success || as_fn_exit 1\r\nfi\r\nif test -n \"$ac_unrecognized_opts\" && test \"$enable_option_checking\" != no; then\r\n { $as_echo \"$as_me:${as_lineno-$LINENO}: WARNING: unrecognized options: $ac_unrecognized_opts\" >&5\r\n$as_echo \"$as_me: WARNING: unrecognized options: $ac_unrecognized_opts\" >&2;}\r\nfi\r\n\r\n\r\nAt the same time I also modified the environment variables. Because I added it to buildroot to compile it, I wrote the environment variables that need to be modified into the MK file. MK file type is as follows\r\n\r\nSNES9X_VERSION = 0.1.4\r\nSNES9X_SITE_METHOD = local\r\nSNES9X_SITE = $(TOPDIR)/../external/snes9x\r\nSNES9X_SUBDIR = unix\r\n\r\n#SNES9X_INSTALL_STAGING = YES\r\n\r\n# Not compatible with GCC 6 which defaults to GNU++14\r\n#SNES9X_CONF_ENV += CXXFLAGS=\"$(TARGET_CXXFLAGS) -std=gnu++98\"\r\n\r\nSNES9X_CONF_ENV = snes9x_cv_option_o3+=no \\\r\n\tsnes9x_cv_option_o2+=no \\\r\n\tsnes9x_cv_option_o1+=no \\\r\n\tsnes9x_cv_option_omit_frame_pointer+=no \\\r\n\tsnes9x_cv_option_mcpu+=no \\\r\n\tsnes9x_cv_option_no_exceptions+=no \\\r\n\tsnes9x_cv_option_no_rtti+=no \\\r\n\tsnes9x_cv_option_Wall+=no \\\r\n\tsnes9x_cv_option_W+=no \\\r\n\tsnes9x_cv_option_Wno_unused_parameter+=no \\\r\n\tsnes9x_cv_option_sse41+=no \\\r\n\tsnes9x_have_stdint_h=no=no \\\r\n\tsnes9x_cv_option_pedantic+=no \\\r\n\tsnes9x_cv_option_Wno_unused_parameter+=no \\\r\n\tsnes9x_cv_option_avx2+=no \\\r\n\tsnes9x_cv_option_neon+=no \\\r\n\tsnes9x_ptr_is_int=no \\\r\n\tsnes9x_sar_int8=no \\\r\n\tsnes9x_sar_int16=no \\\r\n\tsnes9x_sar_int32=no \\\r\n\tac_cv_func_gethostbyname=no\r\n\r\n define SNES9X_INSTALL_TARGET_CMDS\r\n $(INSTALL) -D -m 755 $(@D)/unix/snes9x $(TARGET_DIR)/usr/bin/snes9x\r\n endef\r\n$(eval $(autotools-package))\r\n\r\nI have compiled this and I have got the executable snes9x. I copied the executable file and related dependencies to the target board. (snes9x and all since the X11 library)\r\nA rom file is also prepared. 
Running on the development board but reporting an error.\r\n\r\n[root@arm-linux]#./bin/snes9x /mnt/sdcard/Neil_bin/jtbw.smc \r\n\r\n\r\nSnes9x 1.60 for unix\r\n[ 195.341970] DSP power on\r\n/mnt/sdcard/Neil_bin/jtbw.smc 0 rom_filename=/mnt/sdcard/Neil_bin/jtbw.smc fragment size: 0\r\n[ 195.342679] ##dsp##: DSP is ready to work, firmware version: CVR_V2.3.4_20180202\r\nPort 1: Pad #1. Port 2: <none>. \r\nNeil main 1500 Neil main 1546 Found ROM file header (and ignored it).\r\nMap_LoROMMap\r\n\"Street Fighter 2\" [checksum ok] LoROM, 16Mbits, ROM, PAL, SRAM:0Kbits, ID:n 5, CRC32:8B93F566\r\njoystick: No joystick found.\r\nFailed to connect to X server.\r\n[ 196.169430] DSP power off\r\n[root@arm-linux]#\r\n\r\n This is all the information I am running on the development board.\r\n I found a fatal error through the source code. In the X11.cpp file. \u00a0\u00a0 But I don't find out why this is the case. My development board device has a display device\r\n\r\nvoid S9xInitDisplay (int argc, char **argv)\r\n{\r\n\tprintf (\"Neil\t%s %d \\n\",__func__,__LINE__);\r\n\r\n\tGUI.display = XOpenDisplay(NULL);\r\n\tif (GUI.display == NULL)\r\n\t\tFatalError(\"Failed to connect to X server.\");\r\n\r\n\r\n[root@arm-linux]#ls /dev/fb*\r\n/dev/fb0 /dev/fb1 /dev/fb2\r\n[root@arm-linux]#\r\n\r\nAt this point I don't know how to continue. I hope to have the help of friends who know this problem. thank you very much.\r\nForgive me for being so bad in English, I hope you can understand what I mean and give a response.\r\n\r\nthanks\uff01\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Installation instructions reference the wrong git repository. ","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Deprecated imports and non-working examples for embedded Qt console\n\nHi,\r\n\r\nwhen trying out the examples mentioned in the documentation\r\n[Embedding the QtConsole in a Qt application](https://github.com/jupyter/qtconsole/blob/master/docs/source/index.rst#embedding-the-qtconsole-in-a-qt-application), I noticed that the embedded IPython kernel example uses deprecated or defunct import locations:\r\n(https://github.com/ipython/ipykernel/blob/master/examples/embedding/ipkernel_qtapp.py)\r\n I could not figure out how to make that example work.\r\n\r\nWhat did found working instead was using the IPython.embed_kernel() method as described in (closed) issue #197. For that method however, the documentation is very sparse, i.e. where to specify the connection file for the embedded kernel I had to find out from the closed issue example code.\r\n\r\n\r\n\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Is there a README to introduce the tool ?","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Build / install problems (maybe lack of documentation?)\n\nHi,\r\n\r\nI'd really like to try using this with my Starbook S (firmware 2.7) and Sky Chart, but I've reached the limit of my knowledge... What would be _fantastic_ would be if you were able to build an installer for the driver to add to the repo (for me, Windows 10 64-bit, but I realise there'll be other requirements).\r\n\r\nAnyway, I downloaded all the build tool chain and bits and pieces, and I can successfully build the VS project. But I'm stuck after that. What should I do next to install it? I tried the ASCOM Driver Install Script Generator, and ran it on both the `\\bin\\Release` and `\\obj\\Release` folders. But the script it generates first of all fails a version check with Inno Setup Compiler (min version 6.0 now), and then fails on this line:\r\n```\r\n P := CreateOleObject('ASCOM.Utilities.Profile');\r\n```\r\nI don't see a dll by that name in either output folder, so not sure what to try next... Any pointers or more detailed build / install steps much appreciated...","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Google Presentations","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# setNumThreads not honored\n\n<!--\r\nIf you have a question rather than reporting a bug please go to http://answers.opencv.org where you get much faster responses.\r\nIf you need further assistance please read [How To Contribute](https://github.com/opencv/opencv/wiki/How_to_contribute).\r\n\r\nPlease:\r\n\r\n* Read the documentation to test with the latest developer build.\r\n* Check if other person has already created the same issue to avoid duplicates. You can comment on it if there already is an issue.\r\n* Try to be as detailed as possible in your report.\r\n* Report only one problem per created issue.\r\n\r\n\r\nThis is a template helping you to create an issue which can be processed as quickly as possible. This is the bug reporting section for the OpenCV library.\r\n-->\r\n\r\n##### System information (version)\r\n<!-- Example\r\n- OpenCV => 3.1\r\n- Operating System / Platform => Windows 64 Bit\r\n- Compiler => Visual Studio 2015\r\n-->\r\n\r\n- OpenCV => 4.1.0\r\n- Operating System / Platform => Linux 4.4.0\r\n- Compiler => See below\r\n\r\n##### Detailed description\r\n\r\nOn a multi-core system, `setNumThreads` doesn't seem to be taken into account. No matter what I try, my code and dependencies will spawn N threads whenever I call `cv2` functions, where N is the number of cores on the machine.\r\n\r\nWe're already taking care of parallelism by creating as many processes as we have cores (python code), so we don't need OpenCV to spawn threads on top of that in the background, it only creates more CPU contention.\r\n\r\n##### Steps to reproduce\r\n\r\n```\r\n>>> import cv2\r\n>>> cv2.setNumThreads(1)\r\n>>> c = cv2.VideoCapture('/path/to/video/file')\r\n```\r\n\r\nAt this point a `ps -eLf` in another terminal will show as many python threads as there are cores on the machine, instead of only 1 as per `setNumThreads(1)`.\r\n\r\nSo for instance on a machine with 64 cores, creating 64 processes (e.g. with `multiprocessing.Process`) will create 4096 threads in the background. Definitely not what we want. 
Creating only 1 process and 64 threads results in very poor performance as we have higher level work we need to parallelize.\r\n\r\nNB: I've tried `export OMP_NUM_THREADS=1`, it doesn't seem to help either.\r\n\r\n\r\n---\r\n\r\n** SYSTEM AND BUILD INFO (this is a Deep Learning AMI on EC2)**\r\n\r\n```bash\r\n$ python\r\nPython 3.6.6 |Anaconda, Inc.| (default, Jun 28 2018, 17:14:51) \r\n[GCC 7.2.0] on linux\r\nType \"help\", \"copyright\", \"credits\" or \"license\" for more information.\r\n>>> import cv2\r\n>>> print(cv2.getBuildInformation())\r\n\r\nGeneral configuration for OpenCV 4.1.0 =====================================\r\n Version control: 4.1.0\r\n\r\n Extra modules:\r\n Location (extra): /io/opencv_contrib/modules\r\n Version control (extra): 4.1.0\r\n\r\n Platform:\r\n Timestamp: 2019-04-11T17:07:54Z\r\n Host: Linux 4.4.0-101-generic x86_64\r\n CMake: 3.9.0\r\n CMake generator: Unix Makefiles\r\n CMake build tool: /usr/bin/gmake\r\n Configuration: Release\r\n\r\n CPU/HW features:\r\n Baseline: SSE SSE2 SSE3\r\n requested: SSE3\r\n Dispatched code generation: SSE4_1 SSE4_2 FP16 AVX AVX2\r\n requested: SSE4_1 SSE4_2 AVX FP16 AVX2 AVX512_SKX\r\n SSE4_1 (13 files): + SSSE3 SSE4_1\r\n SSE4_2 (1 files): + SSSE3 SSE4_1 POPCNT SSE4_2\r\n FP16 (0 files): + SSSE3 SSE4_1 POPCNT SSE4_2 FP16 AVX\r\n AVX (4 files): + SSSE3 SSE4_1 POPCNT SSE4_2 AVX\r\n AVX2 (27 files): + SSSE3 SSE4_1 POPCNT SSE4_2 FP16 FMA3 AVX AVX2\r\n\r\n C/C++:\r\n Built as dynamic libs?: NO\r\n C++ Compiler: /usr/lib/ccache/compilers/c++ (ver 4.8.2)\r\n C++ flags (Release): -Wl,-strip-all -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wsign-promo -Wuninitialized -Winit-self -Wno-delete-non-virtual-dtor -Wno-comment -Wno-missing-field-initializers -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fvisibility-inlines-hidden -O3 -DNDEBUG -DNDEBUG\r\n C++ flags (Debug): -Wl,-strip-all -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wundef -Winit-self -Wpointer-arith -Wsign-promo -Wuninitialized -Winit-self -Wno-delete-non-virtual-dtor -Wno-comment -Wno-missing-field-initializers -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -fvisibility-inlines-hidden -g -O0 -DDEBUG -D_DEBUG\r\n C Compiler: /usr/lib/ccache/compilers/cc\r\n C flags (Release): -Wl,-strip-all -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wuninitialized -Winit-self -Wno-comment -Wno-missing-field-initializers -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -O3 -DNDEBUG -DNDEBUG\r\n C flags (Debug): -Wl,-strip-all -fsigned-char -W -Wall -Werror=return-type -Werror=non-virtual-dtor -Werror=address -Werror=sequence-point -Wformat -Werror=format-security -Wmissing-declarations -Wmissing-prototypes -Wstrict-prototypes -Wundef -Winit-self -Wpointer-arith -Wuninitialized -Winit-self -Wno-comment 
-Wno-missing-field-initializers -fdiagnostics-show-option -Wno-long-long -pthread -fomit-frame-pointer -ffunction-sections -fdata-sections -msse -msse2 -msse3 -fvisibility=hidden -g -O0 -DDEBUG -D_DEBUG\r\n Linker flags (Release): -L/root/ffmpeg_build/lib -Wl,--gc-sections \r\n Linker flags (Debug): -L/root/ffmpeg_build/lib -Wl,--gc-sections \r\n ccache: YES\r\n Precompiled headers: NO\r\n Extra dependencies: ade /opt/Qt4.8.7/lib/libQtGui.so /opt/Qt4.8.7/lib/libQtTest.so /opt/Qt4.8.7/lib/libQtCore.so /lib64/libz.so /opt/libjpeg-turbo/lib64/libjpeg.a dl m pthread rt\r\n 3rdparty dependencies: ittnotify libprotobuf libwebp libpng libtiff libjasper IlmImf quirc\r\n\r\n OpenCV modules:\r\n To be built: aruco bgsegm bioinspired calib3d ccalib core datasets dnn dnn_objdetect dpm face features2d flann fuzzy gapi hfs highgui img_hash imgcodecs imgproc line_descriptor ml objdetect optflow phase_unwrapping photo plot python3 quality reg rgbd saliency shape stereo stitching structured_light superres surface_matching text tracking video videoio videostab xfeatures2d ximgproc xobjdetect xphoto\r\n Disabled: world\r\n Disabled by dependency: -\r\n Unavailable: cnn_3dobj cudaarithm cudabgsegm cudacodec cudafeatures2d cudafilters cudaimgproc cudalegacy cudaobjdetect cudaoptflow cudastereo cudawarping cudev cvv freetype hdf java js matlab ovis python2 sfm ts viz\r\n Applications: -\r\n Documentation: NO\r\n Non-free algorithms: NO\r\n\r\n GUI: \r\n QT: YES (ver 4.8.7 EDITION = OpenSource)\r\n QT OpenGL support: NO\r\n GTK+: NO\r\n VTK support: NO\r\n\r\n Media I/O: \r\n ZLib: /lib64/libz.so (ver 1.2.3)\r\n JPEG: /opt/libjpeg-turbo/lib64/libjpeg.a (ver 62)\r\n WEBP: build (ver encoder: 0x020e)\r\n PNG: build (ver 1.6.36)\r\n TIFF: build (ver 42 - 4.0.10)\r\n JPEG 2000: build (ver 1.900.1)\r\n OpenEXR: build (ver 1.7.1)\r\n HDR: YES\r\n SUNRASTER: YES\r\n PXM: YES\r\n PFM: YES\r\n\r\n Video I/O:\r\n DC1394: NO\r\n FFMPEG: YES\r\n avcodec: YES (58.47.106)\r\n avformat: YES (58.26.101)\r\n avutil: YES (56.26.100)\r\n swscale: YES (5.4.100)\r\n avresample: NO\r\n GStreamer: NO\r\n v4l/v4l2: YES (linux/videodev2.h)\r\n\r\n Parallel framework: pthreads\r\n\r\n Trace: YES (with Intel ITT)\r\n\r\n Other third-party libraries:\r\n Lapack: NO\r\n Eigen: NO\r\n Custom HAL: NO\r\n Protobuf: build (3.5.1)\r\n\r\n OpenCL: YES (no extra features)\r\n Include path: /io/opencv/3rdparty/include/opencl/1.2\r\n Link libraries: Dynamic load\r\n\r\n Python 3:\r\n Interpreter: /opt/python/cp36-cp36m/bin/python (ver 3.6.8)\r\n Libraries: libpython3.6m.a (ver 3.6.8)\r\n numpy: /opt/python/cp36-cp36m/lib/python3.6/site-packages/numpy/core/include (ver 1.11.3)\r\n install path: python\r\n\r\n Python (for build): /opt/python/cp36-cp36m/bin/python\r\n\r\n Java: \r\n ant: NO\r\n JNI: NO\r\n Java wrappers: NO\r\n Java tests: NO\r\n\r\n Install to: /io/_skbuild/linux-x86_64-3.6/cmake-install\r\n-----------------------------------------------------------------\r\n\r\n\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Program overwriting existing images every time I run it\n\n(exhentai)\r\n\r\nFor some reason, when I ran the program a second time with a new batch of files, I couldn't find the images from the previous batch. I suspect the program overwrote it-- if not, then I have no idea what happened.\r\n\r\nAlso, the program accesses files that have already been downloaded, increasing the counter toward image limits. Is this intended?\r\n\r\nIs there an exhentai config files around somewhere?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Pre-release metadata ignored in sorting of version strings\n\nAs reported by @wolfgangmm in this week's Community Call, with eXist Documentation 5.0.0-RC1 installed, the Package Manager did not report the availability of the same app's 5.0.0-RC3 version. \r\n\r\nI traced this to the function `packages:is-newer()` in https://github.com/eXist-db/existdb-packageservice/blob/94202f09413077368fc78dfe8f764b5af7071587/modules/packages.xqm#L341-L359 where the versions of the newest available package are compared to the currently installed version. This code only compares the major, minor, and patch release identifiers of a version string\u2014much in the same way the public-repo code did before https://github.com/eXist-db/public-repo/pull/44.\r\n\r\nThis code should be replaced with https://github.com/eXist-db/semver.xq, with its robust and tested handling of SemVer 2.0 version strings and ability to coerce non-compliant strings into shape.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Bug: EffectsModule, StoreRouterConnectingModule and StoreDevtoolsModule need to be imported only after StoreModule","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Revise the README","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"wishlist for next version","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"kernelinfo doesn't compile for linux_osi","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# How long did you train an agent\uff1f\n\nHello, thank you and @SleepingFox88 for your contribution.\r\nMay I know how long it took you to train an agent\uff1fMaybe one day or two days? My agent is always not very good. Is it because the training time is too short?\r\nThank you very much.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"rpi:pkgsrc/security/libgcrypt","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Server using SWAP memory while there is enough host memory available\n\nCommand:\r\n```\r\ndocker pull itzg/minecraft-server\r\ndocker stop mc\r\ndocker rm mc\r\ndocker run -d -it --restart always \\\r\n -e EULA=TRUE -e VERSION=1.14 \\\r\n -e MAX_MEMORY=32G -e MAX_RAM=32G \\\r\n -e MIN_MEMORY=8G -e MIN_RAM=8G \\\r\n -e PERMGEN_SIZE=4G -p 25565:25565 \\\r\n -v /home/minecraft/data:/data \\\r\n --name mc \\\r\n itzg/minecraft-server\r\n```\r\n\r\n`free --giga -h` output:\r\n```\r\n total used free shared buff/cache available\r\nMem: 515G 10G 1.9G 28M 503G 502G\r\nSwap: 5.8G 5.4G 401M\r\n```\r\n\r\n` smem -s swap -t -n -k` output:\r\n```\r\n PID User Command Swap USS PSS RSS\r\n(...)\r\n36671 1000 mc-server-runner --stop-dur 3.4M 920.0K 920.0K 924.0K\r\n36074 0 /usr/bin/docker-proxy -prot 4.4M 212.0K 422.0K 2.1M\r\n36088 0 /usr/bin/docker-proxy -prot 10.3M 212.0K 422.0K 2.1M\r\n36461 0 /usr/local/libexec/ipsec/pl 12.6M 8.9M 9.2M 9.7M\r\n 1238 0 /usr/bin/dockerd -H fd:// - 13.7M 65.4M 65.5M 68.3M\r\n 1225 0 /usr/bin/containerd 20.0M 54.9M 54.9M 56.6M\r\n36776 1000 java -XX:+UseG1GC -Xms1G -X 5.0G 8.4G 8.4G 8.4G\r\n-------------------------------------------------------------------------------\r\n 131 11 5.1G 9.2G 9.3G 9.9G\r\n```\r\n\r\nThe minecraft server is allowed to use 32GB of memory. It uses 9.2GB of memory and 5.1GB of swap. Total: 14.3GB. Consuming 99% of host SWAP memory when there's 502GB of RAM available is not acceptable. SWAP is meant only for writing to memory mapped files (doesn't require much space) and recovery actions/purposes on servers with plenty of RAM (when a process goes 'rogue' and leaks memory like a madman). Other docker containers don't exhibit this issue.\r\n\r\nThis issue has been described (KB000736) by docker and should be fixed with the mentioned steps:\r\n\r\nhttps://success.docker.com/article/node-using-swap-memory-instead-of-host-memory\r\n\r\nIf this is intended/desired behavior I would like to request help with preventing the use of SWAP by the minecraft docker container(s): Are there parameters to disable this behavior?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"help page","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# [SUGGESTION]\u00a0Use standard a CommonJS export\n\nCurrently, `callback-to-async-iterator` is written using ES6 modules, and transpiled to CommonJS using [babel-plugin-transform-modules-commonjs](https://babeljs.io/docs/en/babel-plugin-transform-modules-commonjs).\r\n\r\nThe export in the result file in the `dist` folder looks like this:\r\n``` js\r\nexports.default = callbackToAsyncIterator;\r\n```\r\n\r\nwhich means in a standard CommonJS environment, you have to write something like this to use it:\r\n```js\r\nconst callbackToAsyncIterator = require('callback-to-async-iterator').default\r\n```\r\n\r\nThis is a bit unexpected, because I think this module is primarily targeted at NodeJS environments, and the docs don't mention it.\r\n\r\nI would suggest using regular CommonJS exports in the source code, to make sure the exported `dist/index.js` file looks like this:\r\n```js\r\nmodule.exports = callbackToAsyncIterator\r\n```\r\n\r\nThis will probably not make any differences for people using it in a ES6 module environment, because tools like Babel or Webpack usually [interop quite nicely with CommonJS modules](https://2ality.com/2015/12/babel-commonjs.html#default-imports).\r\n\r\nWhat are your thoughts on this? I can work on a PR if that sounds good","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Better README","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update the README.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Problems with version 2.4.15 and slider\n\nProblems with version 2.4.15 and slider\r\n\r\n### Issue type:\r\n- [ ] Feature request <!-- Requesting the implementation of a new feature -->\r\n- [x] Bug report <!-- Reporting unexpected or erroneous behavior --> \r\n- [ ] Documentation <!-- Proposing a modification to the documentation -->\r\n\r\n### Environment:\r\n* AdminLTE Version: 2.4.15\r\n* Operating System: macos\r\n* Browser (Version): Chrome (Latest)\r\n\r\n### Description:\r\n\r\nIm trying to use version **2.4.15** but getting error:\r\n\r\n```\r\nmake composer-update docker=1\r\ndocker exec y2aa_php_fpm composer update\r\nDo not run Composer as root/super user! See https://getcomposer.org/root for details\r\nLoading composer repositories with package information\r\nUpdating dependencies (including require-dev)\r\nYour requirements could not be resolved to an installable set of packages.\r\n\r\n Problem 1\r\n - Installation request for bower-asset/admin-lte 2.4.15 -> satisfiable by bower-asset/admin-lte[v2.4.15].\r\n - bower-asset/admin-lte v2.4.15 requires bower-asset/bootstrap-slider 879-alpha98 -> no matching package found.\r\n\r\nPotential causes:\r\n - A typo in the package name\r\n - The package is not available in a stable-enough version according to your minimum-stability setting\r\n see <https://getcomposer.org/doc/04-schema.md#minimum-stability> for more details.\r\n - It's a private package and you forgot to add a custom repository to find it\r\n\r\nRead <https://getcomposer.org/doc/articles/troubleshooting.md> for further common problems.\r\nmake: *** [composer-update] Error 2\r\n```\r\n\r\nMy composer:\r\n\r\nhttps://github.com/prsolucoes/yii2-app-advanced/blob/mega-update/composer.json\r\n\r\nThanks for any help.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"segmentationfault in amqp sample if started without internet connection and afterwards connecting again","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add element for Disclaimer text","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Xdebug remote_host is ignored by remote_connect_back\n\nIn your dockerfile you setup the host ip as xdebug _remote_host_:\r\nhttps://github.com/jorge07/alpine-php/blob/c3f14879245d680596be25513b2d9052e3874663/7.3/Dockerfile.dev#L32\r\n\r\nBut you leave the _remote_connect_back_ set to 1\r\nhttps://github.com/jorge07/alpine-php/blob/c3f14879245d680596be25513b2d9052e3874663/7.3/devfs/etc/php7/conf.d/00_xdebug.ini#L9\r\n\r\nWhile as the documentation explains:\r\n\r\n> **xdebug.remote_connect_back**\r\nIf enabled, the _xdebug.remote_host_ setting is ignored and Xdebug will try to connect to the client that made the HTTP request.\r\n\r\nTherefore, souldn't you want the _remote_connect_back_ set to 0?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[Feature] webpack config being able to process either sass or scss ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Old Documentation on nginx\n\n**Describe the bug, try to make it reproducible**\r\nIt's not really a bug, more a depreciated documentation.\r\n[https://mailcow.github.io/mailcow-dockerized-docs/u_e-webmail-site/](https://mailcow.github.io/mailcow-dockerized-docs/u_e-webmail-site/)\r\n\r\nThe documentation is about editing the _data/conf/nginx/webmail.conf_ to add a subdomain for the SoGo Webmail Client.\r\nBut _data/conf/nginx/webmail.conf_ doesn't exist.\r\n\r\nFurther information (where applicable):\r\n- mailcow-dockerized is on the latest version\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update Administrator Documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Convert processor fails to convert float to integer","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# API Docs should include function in short description.\n\nNow that the first version of the API docs for this library have been created, it's time to refine them. After using them a bit, I think it would help users if the function call was included in the short description. \r\n\r\nAn example of the changes required to fulfill the scope of this issue has been implemented with the Mnemonic library. A screenshot will be attached below.\r\n\r\nSome people will want to look through functions based on their behavior, as is captured currently. But some users will want to quickly locate the documentation for a function they already know exists. Having the function in the short description accomplishes this.\r\n\r\nRight now, the short description is rather squished into the left column. A future issue could be to reformat the column so increase its width, thought that is outside the scope of this function.\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# What is the proper way to use `Transaction` with multiple prepared queries\n\nI'm trying to create a transaction of multiple prepared statements inside\r\n1. It's TypeScript code: \r\n```\r\nconst request = new Request (transaction ? transaction : pool);\r\n /* or */ \r\nconst stmt = new PreparedStatement(transaction ? transaction : pool);\r\nconst query = ``;\r\n```\r\nTS compiler complaints about `Request` can't get `ConnectionPool | Transaction` type. So I can't continue to develop because of these errors\r\n\r\n2. I can't find proper examples in the docs of using transactions with multiple prepared statements \r\nAnd maybe TS examples\r\n\r\n * NodeJS: 12.8\r\n * node-mssql: 5.1.0\r\n * msnodesqlv8: 0.8.3\r\n * SQL Server: 16","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add usage in README.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Include more compilers on travis.ci","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# We should add documentation regarding widget configuration\n\nI am using this project for a side project and I wanted to add `maximumSelectionLength` attribute to my `Select2MultipleWidget`, and I was scouring through the docs and found nothing. I tried to edit the `django_select2.js` static file in `/static/django_select2/` (I know, hacky, but I was out of options \ud83d\ude1b)\r\n\r\nA complete reference of Select2's config is available here: https://select2.org/configuration/options-api\r\n\r\nThough, I think a minor difference is the config we provide to django_select2 must begin with `data-` and the config name must be lowercase, with each word separated by a hyphen.\r\n\r\nSo `maximumSelectionLength` becomes `data-maximum-selection-length`\r\n\r\nThen I found out I had to provide a dict of config like `Select2MultipleWidget({'data-maximum-selection-length': 5})` to make it work.\r\n\r\nI'd like to add a documentation page regarding configuring widget attributes so that it'll be a good reference for the future and help others, too.\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Documentation: Starting the server","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Create readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# yarn deploy error\n\n**Describe the bug**\r\nerror Command \"graphql\" not found.\r\n\r\n**To Reproduce**\r\nSteps to reproduce the behavior:\r\nyarn run deploy\r\n\r\nSolution: \r\nyarn add graphql-cli\r\n\r\nProbably you have installed globaly.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Lakka Generic PC issues\n\nCPU: Intel Haswell Intel\u00a9 Core\u2122 i5-4670 CPU @ 3.40GHz \u00d7 4\r\nGPU: N/A\r\nRAM: 4gb \r\nSSD 60gb\r\n\r\n1.One of the usual annoyances I find with Lakka OS is right off the bat when it is successfully installed is that you have to fiddle with the audio setting because there is no sound and it was not letting me cycle through the list untill I rebooted it.\r\n\r\n2. PPSSPP has issues with audio and game saving. It's as if the function has been removed and so far I had to resort to using save states instead but this does not work right with certain titles that need multiple playthroughs for new game+. When attempting to save the screen goes black and you can hear the emulation dropping in quality such as audio and FPS.\r\nnote: i have the audio on low latency setting. \r\n\r\n3. Lakka OS lacks documentation for bios placement. For example PSx do I make a folder under system/ or do I just dump the rest of the bios there. For example would it work like this /system/psx \r\n\r\n4. No mention of bios placement for Atari 5200 as well.\r\n\r\n5. Lakka site does not list what emulator cores are available, where is PS2 emulation core at? The latest Retroarch has it included.\r\n\r\n6.I wanted to delete a entry from a game rom that was on a different drive. But now I'm stuck with a double of the listed game rom.\r\n\r\n7. Screenshots show 0% when taking one through the menu but they actually are taken despite never showing 100%\r\n\r\n8.PPSSPP sometimes when I load a save state it crashes the core. save state loading might start from 75% to 85% when it does happen. \r\n\r\nSo far I have found Lakka OS to be frustrating and I doubt this is all the issues I will find so far with version Lakka-Generic x86_64-2.2.2\r\n\r\nI don't want to install retroarch on Linux Mint because i find it a waste of resources with all the other OS programs/resources running will take away from game emulation. Which is why I like the idea of something like Lakka OS","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation needed to clarify usage","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Commits to backport to 0.179-t","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How do I do this? [transpiler]","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Ionic Apprate Usesuntilprompt","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# How to use your wonderful program\n\nI was looking for a program that converts DPGL data to CSV (or another popular format) in Python,\r\nand I found this great one.\r\nBut there is no useful readme.\r\nCould you tell me how to use this one, please?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation: Emacs server (--daemon) and exwm-enable","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Upgrade to Java 1.6","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation improvement regarding weechat-dev dependency.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"How can we load the edits in another html page without rendering the editor itself","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Implement validity checks for each geometry type","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Document how to get started","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Feedback","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[PRODUCTION] Displaying of Swagger file doesn't work properly in some cases","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Which version is suitable for RN 0.59.10\n\nI can't find my answer in the readme section.\r\n\r\nI've not migrated to RN 0.60+ yet. Can I use the latest version of this library, and if the answer is no, then what is the latest stable version that is compatible with RN 0.59.10?\r\n\r\nThanks.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Missing simulation.launch","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add the hammerhead.EVENTS.unhandledRejection event","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update to Angular 4","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"There are no commands defined in the \"permission\" namespace.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"RPC Interface Returning Errors","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Add architecture part to README.md\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"kubectl --as doesn't work inside pod","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Update use of a few ImGui methods and constants\n\nThanks again for Oryol!\r\n\r\nI could be doing something wrong, but following the README.md exactly, I get errors during an `osx-xcode-debug` fips build. Some build error messages are included at end of this text, but I think I found the solution.\r\n\r\nIt seems that some imgui breaking changes are breaking some of these samples. \r\n\r\nhttps://github.com/rdroe/oryol-samples/tree/imgui-updates gets past my build errors. I have also done quick manual checks of each previously broken demo, and it seems all right in the branch just mentioned. \r\n\r\nIf I'm right, the relevant ImGui updates are mentioned in the Api Breaking Changes section around https://github.com/ocornut/imgui/blob/62143dff64c8cfb960248b84cf6c4566ffc5a743/imgui.cpp#L374 (with further references mentioned there). \r\n\r\nImGui::IsMouseHoveringAnyWindow() --> ImGui::IsWindowHovered(ImGuiHoveredFlags_AnyWindow)\r\nImGuiSetCond_Once --> ImGuiCond_Once \r\nImGuiSetCond_FirstUserEver --> ImGuiCond_FirstUseEver\r\n\r\n### Error text\r\n\r\n```\r\n/Users/robertroe/sites/projects-2/oryol-samples/src/ImGuiDemo/ImGuiDemo.cc:62:51: error: use of undeclared identifier 'ImGuiSetCond_FirstUseEver'; did you\r\n mean 'ImGuiCond_FirstUseEver'?\r\n ImGui::SetNextWindowSize(ImVec2(200,100), ImGuiSetCond_FirstUseEver);\r\n ^~~~~~~~~~~~~~~~~~~~~~~~~\r\n ImGuiCond_FirstUseEver\r\nIn file included from /Users/robertroe/sites/projects-2/oryol-samples/src/ImGuiDemo/ImGuiDemo.cc:9:\r\nIn file included from /Users/robertroe/sites/projects-2/oryol-imgui/src/IMUI/IMUI.h:14:\r\nIn file included from /Users/robertroe/sites/projects-2/oryol-imgui/src/IMUI/imguiWrapper.h:10:\r\n/Users/robertroe/sites/projects-2/fips-imgui/imgui/imgui.h:1189:5: note: 'ImGuiCond_FirstUseEver' declared here\r\n ImGuiCond_FirstUseEver = 1 << 2, // Set the variable if the object/window has no persistently saved data (no entry in .ini file)\r\n ^\r\n/Users/robertroe/sites/projects-2/oryol-samples/src/ImGuiDemo/ImGuiDemo.cc:70:50: error: use of undeclared identifier 'ImGuiSetCond_FirstUseEver'; did you\r\n mean 'ImGuiCond_FirstUseEver'?\r\n ImGui::SetNextWindowPos(ImVec2(460, 20), ImGuiSetCond_FirstUseEver);\r\n ^~~~~~~~~~~~~~~~~~~~~~~~~\r\n ImGuiCond_FirstUseEver\r\n```\r\n\r\nAlso \r\n\r\n```\r\n/Users/robertroe/sites/projects/oryol-samples/src/OrbViewer/Main.cc:127:17: error: no member named 'IsMouseHoveringAnyWindow' in namespace 'ImGui'\r\n if (!ImGui::IsMouseHoveringAnyWindow()) {\r\n ~~~~~~~^\r\n1 error generated.\r\n\r\n** BUILD FAILED **\r\n```\r\n\r\nAnd one more is like the first, but the constant is named `ImGuiSetCond_FirstUseEver`, needing to change to `ImGuiCond_FirstUseEver` (but please double-check the branch with all the tests you know of).\r\n\r\n\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# CVE-2019-6284 (Medium) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2019-6284 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nIn LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::alternatives in prelexer.hpp.\n\n<p>Publish Date: 2019-01-14\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284>CVE-2019-6284</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p>\n<p>Release Date: 2019-08-06</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Does dCache use ACLs in ZooKeeper?!","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"amp-image: Placeholder fails to hide when unsupported source is supplied","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Circle: set color, etc.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Compiling 1B overhead markers\n\nI'm thinking of going through the repository and identifying all the known 1B log lines so they can be dropped into a single file for easy reference. In four-player content it's usually not too bad to isolate the 1B lines and figure things out, but in 24-player content it's not always simple. It (seems to me it) would be nice to just be able to search a log for a given marker type rather than sorting through all 1B lines.\r\n\r\nI envision this as a .js-formatted text file similar to the [triggers readme](https://github.com/quisquous/cactbot/blob/master/ui/raidboss/data/triggers/README.txt). A sample would be something like this:\r\n\r\n```\r\n// 0028\r\n// Earth Shaker\r\n// Damage cone--don't overlap and don't point at other players\r\n// Seen in Sephirot EX, St Mocianne (HARD) and Dun Scaith, among others.\r\n/ 1B:........:(\\y{Name}):....:....:0028/\r\n```\r\n\r\nObviously this isn't the final revision on the formatting, but it should be a good baseline. Is this something that might be worth doing, or would it be useful to anyone if I take the time to do it?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[Documentation] Adding a custom translatable model","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Need the README\n\nAs the title says, this project is missing a `README`.\r\n\r\nI think it should have this doc so that developers know how to use this PHP package.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Users should be able to log in and log out","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Rewrite of grid-area values","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Render Audio normally but Video to ByteBuffer only?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"TinyTracker project","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Resolve broken API definition for Oathkeeper\n\nSee https://www.ory.sh/docs/oathkeeper/sdk/api#judge-if-a-request-should-be-allowed-or-not - which should be /decisions","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# BitConverter: ToDouble, ToSingle, Int64BitsToDouble, Int32BitsToDouble\n\n**Link to article:**\r\nhttps://docs.microsoft.com/en-us/dotnet/api/system.bitconverter.todouble?view=netstandard-2.0\r\n\r\n**Problem:**\r\n\r\nMention that it might not be possible to use these methods to produce what are so-called _signaling NaN_s in x86, x64, and certain other instruction set architectures. A _signaling NaN_ is an NaN with the most significant bit of the mantissa area clear (as opposed to a _quiet NaN_ where that bit is set). An example of a signaling NaN is as follows (with little-endian byte ordering):\r\n\r\n new byte[]{ 1, 0, 0, 0, 0, 0, 0xf0, 0x7f };\r\n\r\nAt least in x86, using ToDouble and converting the resulting `double` back will convert the signaling NaN to a quiet NaN as follows:\r\n\r\n new byte[]{ 1, 0, 0, 0, 0, 0, 0xf8, 0x7f };\r\n\r\nThis can be an issue if the application works with binary serialization formats that store `double` or `float` values in the form of the bits that make them up, because converting those bits to `double` or `float` with these BitConverter methods may make round-tripping that `double` or `float` not always possible.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Would notification drawer benefit from a general empty state?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# BRAT Version 3.1.00 is here!\n\n@wally-mac @bangen @joewheaton \r\n\r\n@tyler1218hatch and I have finally finished up doing a complete overhaul of BRAT and I just released the new version as 3.1.00. This issue is just to document the major improvements that are contained in this release. In order of the tools, these include:\r\n\r\n- BRAT Project Builder: Incorporating beaver dams and land ownership in BRAT folder structure and adding layers.\r\n- BRAT Table: Adding the perennial stream network and incorporating it in finding clusters, identifying points of diversion, calculating distance to points of diversion and including this distance in the overall distance to nearest infrastructure, identifying land ownership by reach, and calculating distance to private land. Layers have also been made for all of the above.\r\n- BRAT Braid Handler: Using the stream name to identify main channels.\r\n- iHyd: Adding the hydrological equations as inputs, rather than directing users to edit the code.\r\n- Vegetation Capacity FIS: Extending the upper limit of the capacity model.\r\n- Combined Capacity FIS: Extending the upper limit of the capacity model.\r\n- Conservation Restoration: Incorporating the TNC strategies map as an optional output, renaming of categories for accuracy, and identifying any canals as \"major risk\".\r\n- Data Capture Validation: Adding drainage area threshold as optional input to limit snapping of dams far from the main channel, adding \"Snapped\" field to dams to identify which were used in the validation. Calculating and making layers comparing existing vs. historic capacity, comparing BRAT estimated vs. surveyed dam counts, and % of estimated capacity occupied by dams. Parsing the \"Easiest - Low-hanging fruit\" category into conservation and restoration categories based on the proportion of capacity occupied.\r\n- Layer Package Generator: Adding clipping network functionality, including all new layers, and modifying the ordering so dam density is always shown before dam count in the capacity layers.\r\n- Collect Summary Reports: Completely new. \r\n- BRAT toolbox: Descriptions added to user interface, default values for output names.\r\n- Batch scripts: Cleaned up versions of all batch scripts are provided in the Supporting Tools/Batch Scripts folder.\r\n\r\nLastly, we have improved the website documentation to reflect these changes and updated all in-code documentation as fit, including @tyler1218hatch making all BRAT scripts PEP8 compliant. \r\n\r\n@tyler1218hatch Am I missing anything?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Weasyprint\n\nThanks for creating an issue! But first: did you read our community guidelines?\r\nhttps://cuckoo.sh/docs/introduction/community.html\r\n\r\n##### My issue is: Weasyprint\r\n\r\n2019-08-10 19:00:57,471 [cuckoo.core.plugins] WARNING: The reporting module \"SingleFile\" returned the following error: The weasyprint library hasn't been installed on your Operating System and as such we can't generate a PDF report for you. You can install 'weasyprint' manually by running 'pip install weasyprint' or by compiling and installing package yourself.\r\n\r\n##### My Cuckoo version and operating system are: 2.07 Ubuntu 18.04\r\n\r\nI have tried to install weasyprint .038 and the most recent version, all of which have produced errors. What version of weasyprint do I need to install?\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Create README for strings > only_letters","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Could not create model","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# INTERNAL ERROR on new node after Fast Deploy\n\n### Description of the issue\r\nDescribe the issue you are experiencing.\r\n\r\n### Issue-Type (put a `x` sign in the square brackets)\r\n- [ x] bug report\r\n- [ ] feature request\r\n- [ ] Documentation improvement\r\n\r\n### Checklist\r\n- [x] Running latest version of code.\r\n- [ ] This issue has not been reported earlier.\r\n\r\n### Your environment\r\n* OS ubuntu 16\r\n* Go version 1.12.7\r\n* Release tag/commit of the code\r\n\r\n### Expected behaviour\r\nWhat should happen?\r\n\r\nThe node should start syncing\r\n\r\n### Actual behaviour\r\nWhat is actually happening?\r\nNo blocks are syncing and there is an error in the log all the time.\r\n\r\n### Steps to reproduce\r\n1. Install the latest code of the node via fastdeploy\r\n2. Check the log\r\n\r\n\r\n### Any extra info ( for eg. code snippet to reproduce, logs, etc. )\r\nIf necessary, provide some extra information like code-snippets or error-logs.\r\n\r\n2019/08/10 17:02:32.876010 \u001b[0;32m[INFO ]\u001b[m GID 1, My IP address is x.x.x.x\r\n2019/08/10 17:02:32.910349 \u001b[0;32m[INFO ]\u001b[m GID 1, database Version: 1\r\n2019/08/10 17:02:32.915003 \u001b[0;32m[INFO ]\u001b[m GID 1, state root: (long string of random numbers)\r\n2019/08/10 17:02:32.923956 \u001b[0;32m[INFO ]\u001b[m GID 1, get no ID from local ledger\r\n2019/08/10 17:02:33.433209 \u001b[0;32m[INFO ]\u001b[m GID 1, GetID got resp: {\"error\":{\"code\":-45022,\"data\":null,\"message\":\"INTERNAL ERROR, there is no ID in account\"},\"id\":\"1\",\"jsonrpc\":\"2.0\"} from http://mainnet-seed-0035.nkn.org:30003\r\n\r\n2019/08/10 17:02:33.840712 \u001b[0;32m[INFO ]\u001b[m GID 1, GetID got resp: {\"error\":{\"code\":-45022,\"data\":null,\"message\":\"INTERNAL ERROR, there is no ID in account\"},\"id\":\"1\",\"jsonrpc\":\"2.0\"} from http://mainnet-seed-0013.nkn.org:30003\r\n\r\n2019/08/10 17:02:34.039211 \u001b[0;32m[INFO ]\u001b[m GID 1, GetID got resp: {\"error\":{\"code\":-45022,\"data\":null,\"message\":\"INTERNAL ERROR, there is no ID in account\"},\"id\":\"1\",\"jsonrpc\":\"2.0\"} from http://mainnet-seed-0020.nkn.org:30003\r\n\r\n2019/08/10 17:02:34.363051 \u001b[0;32m[INFO ]\u001b[m GID 1, GetNonceByAddr got resp: {\"id\":\"1\",\"jsonrpc\":\"2.0\",\"result\":{\"currentHeight\":154254,\"nonce\":0,\"nonceInTxPool\":1}} from http://mainnet-seed-0008.nkn.org:30003\r\n\r\nAnd so on...","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"ReferenceError: Textarea is not defined","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Not able to install with the given instructions.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Document the client library categories per component","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add instructions on the main game page, not on GitHub :P","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Merge pull request #4924 from SrNetoChan/dialog_titles","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Prisma client Subsciption $fragments for sub types\n\n**Article Title (if relevant)**\r\n[https://www.prisma.io/docs/prisma-client/features/realtime-JAVASCRIPT-rsc8/](url)\r\n\r\n**Describe the content issue**\r\nUsing prisma client I can't find a way to pass **$fragment** in subscriptions\r\nFor example the following subscription \r\n\r\n`prisma.subscribe.post({\r\n node: {\r\n createdAt_lte: \"DATETIME\"\r\n }\r\n}).node()`\r\n\r\nI want to get the mutation type, the posts and the comments for this post\r\n\r\n**Describe the solution you'd like to see**\r\nWith prisma binding I was able to pass the info or any fragments I need like this\r\n\r\n`prisma.subscription.post({\r\n where {\r\n node: {\r\n createdAt_lte: \"DATETIME\"\r\n }\r\n }\r\n}, '{ \r\n mutation\r\n previousValues { id } \r\n node { id status comments { id title } }\r\n }')`\r\n\r\nIs there Anyway to do the same with prisma client to get the mutation, and sub fields in the response?\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Swift-SMTP - Allow headers to be replaced","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Ability to convert IllegalArgumentException as an HTTP 400 (Bad Request) response","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Get developer docs up to polish with a getting started guide","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Sails.getDatastores is not a function","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"thoughts on colorful test output","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [firebase_ml_vision] breaks build on iOS\n\n\r\n## Steps to Reproduce\r\n\r\nI tried to reproduce in a new project but works fine. I'm having this issue on an existing project. \r\n\r\n1. Add firebase_ml_vision dependency\r\n2. flutter run\r\n\r\n## Logs\r\n\r\n```\r\n \"TOCropViewController\" -framework \"cloud_firestore\" -framework \"firebase_core\" -framework \"firebase_database\"\r\n -framework \"flutter_full_pdf_viewer\" -framework \"fluttertoast\" -framework \"grpc\" -framework \"grpcpp\" -framework\r\n \"image_cropper\" -framework \"image_picker\" -framework \"leveldb\" -framework \"nanopb\" -framework \"openssl_grpc\"\r\n -framework \"path_provider\" -framework \"printing\"\r\n OTHER_SWIFT_FLAGS = -D COCOAPODS -D COCOAPODS\r\n PACKAGE_TYPE = com.apple.package-type.wrapper.application\r\n PASCAL_STRINGS = YES\r\n PATH =\r\n /Applications/Xcode.app/Contents/Developer/usr/bin:/usr/local/opt/node@10/bin:/usr/local/bin:/usr/bin:/bin:/usr/s\r\n bin:/sbin:/Users/andres/Documents/Development/flutter/bin\r\n PATH_PREFIXES_EXCLUDED_FROM_HEADER_DEPENDENCIES = /usr/include /usr/local/include /System/Library/Frameworks\r\n /System/Library/PrivateFrameworks /Applications/Xcode.app/Contents/Developer/Headers\r\n /Applications/Xcode.app/Contents/Developer/SDKs /Applications/Xcode.app/Contents/Developer/Platforms\r\n PBDEVELOPMENTPLIST_PATH = Runner.app/pbdevelopment.plist\r\n PFE_FILE_C_DIALECTS = objective-c\r\n PKGINFO_FILE_PATH =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/PkgInfo\r\n PKGINFO_PATH = Runner.app/PkgInfo\r\n PLATFORM_DEVELOPER_APPLICATIONS_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/Applications\r\n PLATFORM_DEVELOPER_BIN_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin\r\n PLATFORM_DEVELOPER_LIBRARY_DIR =\r\n /Applications/Xcode.app/Contents/PlugIns/Xcode3Core.ideplugin/Contents/SharedSupport/Developer/Library\r\n PLATFORM_DEVELOPER_SDK_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs\r\n PLATFORM_DEVELOPER_TOOLS_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/Tools\r\n PLATFORM_DEVELOPER_USR_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr\r\n PLATFORM_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform\r\n PLATFORM_DISPLAY_NAME = iOS\r\n PLATFORM_NAME = iphoneos\r\n PLATFORM_PREFERRED_ARCH = arm64\r\n PLATFORM_PRODUCT_BUILD_VERSION = 16G73\r\n PLIST_FILE_OUTPUT_FORMAT = binary\r\n PLUGINS_FOLDER_PATH = Runner.app/PlugIns\r\n PODS_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n PODS_CONFIGURATION_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n PODS_PODFILE_DIR_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/.\r\n PODS_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods\r\n PRECOMPS_INCLUDE_HEADERS_FROM_BUILT_PRODUCTS_DIR = YES\r\n PRECOMP_DESTINATION_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/PrefixHea\r\n ders\r\n PRESERVE_DEAD_CODE_INITS_AND_TERMS = NO\r\n PRIVATE_HEADERS_FOLDER_PATH = Runner.app/PrivateHeaders\r\n PRODUCT_BUNDLE_IDENTIFIER = co.com.creece.appsolidariav2\r\n PRODUCT_MODULE_NAME = Runner\r\n PRODUCT_NAME = Runner\r\n PRODUCT_SETTINGS_PATH = 
/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner/Info.plist\r\n PRODUCT_TYPE = com.apple.product-type.application\r\n PROFILING_CODE = NO\r\n PROJECT = Runner\r\n PROJECT_DERIVED_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/DerivedSources\r\n PROJECT_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n PROJECT_FILE_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner.xcodeproj\r\n PROJECT_NAME = Runner\r\n PROJECT_TEMP_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build\r\n PROJECT_TEMP_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n PROVISIONING_PROFILE_REQUIRED = YES\r\n PUBLIC_HEADERS_FOLDER_PATH = Runner.app/Headers\r\n RECURSIVE_SEARCH_PATHS_FOLLOW_SYMLINKS = YES\r\n REMOVE_CVS_FROM_RESOURCES = YES\r\n REMOVE_GIT_FROM_RESOURCES = YES\r\n REMOVE_HEADERS_FROM_EMBEDDED_BUNDLES = YES\r\n REMOVE_HG_FROM_RESOURCES = YES\r\n REMOVE_SVN_FROM_RESOURCES = YES\r\n RESOURCE_RULES_REQUIRED = YES\r\n REZ_COLLECTOR_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/ResourceM\r\n anagerResources\r\n REZ_OBJECTS_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/ResourceM\r\n anagerResources/Objects\r\n SCAN_ALL_SOURCE_FILES_FOR_INCLUDES = NO\r\n SCRIPTS_FOLDER_PATH = Runner.app/Scripts\r\n SDKROOT = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_DIR_iphoneos12_4 =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_NAME = iphoneos12.4\r\n SDK_NAMES = iphoneos12.4\r\n SDK_PRODUCT_BUILD_VERSION = 16G73\r\n SDK_VERSION = 12.4\r\n SDK_VERSION_ACTUAL = 120400\r\n SDK_VERSION_MAJOR = 120000\r\n SDK_VERSION_MINOR = 400\r\n SED = /usr/bin/sed\r\n SEPARATE_STRIP = NO\r\n SEPARATE_SYMBOL_EDIT = NO\r\n SET_DIR_MODE_OWNER_GROUP = YES\r\n SET_FILE_MODE_OWNER_GROUP = NO\r\n SHALLOW_BUNDLE = YES\r\n SHARED_DERIVED_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/DerivedSources\r\n SHARED_FRAMEWORKS_FOLDER_PATH = Runner.app/SharedFrameworks\r\n SHARED_PRECOMPS_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/SharedPrecompiledHeaders\r\n SHARED_SUPPORT_FOLDER_PATH = Runner.app/SharedSupport\r\n SKIP_INSTALL = NO\r\n SOURCE_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n SRCROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n STRINGS_FILE_OUTPUT_ENCODING = binary\r\n STRIP_BITCODE_FROM_COPIED_FILES = YES\r\n STRIP_INSTALLED_PRODUCT = YES\r\n STRIP_STYLE = all\r\n STRIP_SWIFT_SYMBOLS = YES\r\n SUPPORTED_DEVICE_FAMILIES = 1,2\r\n SUPPORTED_PLATFORMS = iphonesimulator iphoneos\r\n SUPPORTS_TEXT_BASED_API = NO\r\n SWIFT_PLATFORM_TARGET_PREFIX = ios\r\n SYMROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n SYSTEM_ADMIN_APPS_DIR = /Applications/Utilities\r\n SYSTEM_APPS_DIR = /Applications\r\n SYSTEM_CORE_SERVICES_DIR = /System/Library/CoreServices\r\n SYSTEM_DEMOS_DIR = /Applications/Extras\r\n SYSTEM_DEVELOPER_APPS_DIR = /Applications/Xcode.app/Contents/Developer/Applications\r\n SYSTEM_DEVELOPER_BIN_DIR = /Applications/Xcode.app/Contents/Developer/usr/bin\r\n SYSTEM_DEVELOPER_DEMOS_DIR = 
/Applications/Xcode.app/Contents/Developer/Applications/Utilities/Built Examples\r\n SYSTEM_DEVELOPER_DIR = /Applications/Xcode.app/Contents/Developer\r\n SYSTEM_DEVELOPER_DOC_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference Library\r\n SYSTEM_DEVELOPER_GRAPHICS_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Graphics Tools\r\n SYSTEM_DEVELOPER_JAVA_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Java Tools\r\n SYSTEM_DEVELOPER_PERFORMANCE_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Performance\r\n Tools\r\n SYSTEM_DEVELOPER_RELEASENOTES_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference Library/releasenotes\r\n SYSTEM_DEVELOPER_TOOLS = /Applications/Xcode.app/Contents/Developer/Tools\r\n SYSTEM_DEVELOPER_TOOLS_DOC_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference\r\n Library/documentation/DeveloperTools\r\n SYSTEM_DEVELOPER_TOOLS_RELEASENOTES_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference\r\n Library/releasenotes/DeveloperTools\r\n SYSTEM_DEVELOPER_USR_DIR = /Applications/Xcode.app/Contents/Developer/usr\r\n SYSTEM_DEVELOPER_UTILITIES_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Utilities\r\n SYSTEM_DOCUMENTATION_DIR = /Library/Documentation\r\n SYSTEM_KEXT_INSTALL_PATH = /System/Library/Extensions\r\n SYSTEM_LIBRARY_DIR = /System/Library\r\n TAPI_VERIFY_MODE = ErrorsOnly\r\n TARGETED_DEVICE_FAMILY = 1,2\r\n TARGETNAME = Runner\r\n TARGET_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n TARGET_NAME = Runner\r\n TARGET_TEMP_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_FILES_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n TOOLCHAIN_DIR = /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain\r\n TREAT_MISSING_BASELINES_AS_TEST_FAILURES = NO\r\n UID = 501\r\n UNLOCALIZED_RESOURCES_FOLDER_PATH = Runner.app\r\n UNSTRIPPED_PRODUCT = NO\r\n USER = andres\r\n USER_APPS_DIR = /Users/andres/Applications\r\n USER_LIBRARY_DIR = /Users/andres/Library\r\n USE_DYNAMIC_NO_PIC = YES\r\n USE_HEADERMAP = YES\r\n USE_HEADER_SYMLINKS = NO\r\n VALIDATE_PRODUCT = YES\r\n VALID_ARCHS = arm64 arm64e armv7 armv7s\r\n VERBOSE_PBXCP = NO\r\n VERSIONING_SYSTEM = apple-generic\r\n VERSIONPLIST_PATH = Runner.app/version.plist\r\n VERSION_INFO_BUILDER = andres\r\n VERSION_INFO_FILE = Runner_vers.c\r\n VERSION_INFO_STRING = \"@(#)PROGRAM:Runner PROJECT:Runner-1\"\r\n WRAPPER_EXTENSION = app\r\n WRAPPER_NAME = Runner.app\r\n WRAPPER_SUFFIX = .app\r\n WRAP_ASSET_PACKS_IN_SEPARATE_DIRECTORIES = NO\r\n XCODE_APP_SUPPORT_DIR = /Applications/Xcode.app/Contents/Developer/Library/Xcode\r\n XCODE_PRODUCT_BUILD_VERSION = 10G8\r\n XCODE_VERSION_ACTUAL = 1030\r\n XCODE_VERSION_MAJOR = 1000\r\n XCODE_VERSION_MINOR = 1030\r\n XPCSERVICES_FOLDER_PATH = Runner.app/XPCServices\r\n YACC = yacc\r\n arch = arm64\r\n variant = normal\r\n[ +100 ms] executing: pod --version\r\n[+1457 ms] 1.6.1\r\n[ +5 ms] Running pod install...\r\n[+1311 ms] Running pod install... 
(completed in 1.3s)\r\n[ ] CocoaPods' output:\r\n \u21b3\r\n[ +1 ms] Preparing\r\n \r\n Analyzing dependencies\r\n \r\n Inspecting targets to integrate\r\n Using `ARCHS` setting to build architectures of target `Pods-Runner`: (``)\r\n \r\n Finding Podfile changes\r\n A firebase_ml_vision\r\n - Flutter\r\n - cloud_firestore\r\n - firebase_core\r\n - firebase_database\r\n - flutter_full_pdf_viewer\r\n - fluttertoast\r\n - image_cropper\r\n - image_picker\r\n - path_provider\r\n - printing\r\n \r\n Fetching external sources\r\n -> Fetching podspec for `Flutter` from `.symlinks/flutter/ios`\r\n -> Fetching podspec for `cloud_firestore` from `.symlinks/plugins/cloud_firestore/ios`\r\n -> Fetching podspec for `firebase_core` from `.symlinks/plugins/firebase_core/ios`\r\n -> Fetching podspec for `firebase_database` from `.symlinks/plugins/firebase_database/ios`\r\n -> Fetching podspec for `firebase_ml_vision` from `.symlinks/plugins/firebase_ml_vision/ios`\r\n -> Fetching podspec for `flutter_full_pdf_viewer` from `.symlinks/plugins/flutter_full_pdf_viewer/ios`\r\n -> Fetching podspec for `fluttertoast` from `.symlinks/plugins/fluttertoast/ios`\r\n -> Fetching podspec for `image_cropper` from `.symlinks/plugins/image_cropper/ios`\r\n -> Fetching podspec for `image_picker` from `.symlinks/plugins/image_picker/ios`\r\n -> Fetching podspec for `path_provider` from `.symlinks/plugins/path_provider/ios`\r\n -> Fetching podspec for `printing` from `.symlinks/plugins/printing/ios`\r\n \r\n Resolving dependencies of `Podfile`\r\n [!] CocoaPods could not find compatible versions for pod \"firebase_ml_vision\":\r\n In Podfile:\r\n firebase_ml_vision (from `.symlinks/plugins/firebase_ml_vision/ios`)\r\n \r\n Specs satisfying the `firebase_ml_vision (from `.symlinks/plugins/firebase_ml_vision/ios`)` dependency were\r\nfound, but they\r\n required a higher minimum deployment target.\r\n \r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:328:in `raise_error_unless_state'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:310:in `block in unwind_for_conflict'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:308:in `tap'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:308:in `unwind_for_conflict'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:684:in `attempt_to_activate'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:254:in `process_topmost_state'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolution.rb:182:in `resolve'\r\n /Library/Ruby/Gems/2.3.0/gems/molinillo-0.6.6/lib/molinillo/resolver.rb:43:in `resolve'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/resolver.rb:91:in `resolve'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer/analyzer.rb:909:in `block in\r\nresolve_dependencies'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/user_interface.rb:64:in `section'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer/analyzer.rb:907:in `resolve_dependencies'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer/analyzer.rb:114:in `analyze'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer.rb:266:in `analyze'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer.rb:174:in `block in resolve_dependencies'\r\n 
/Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/user_interface.rb:64:in `section'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer.rb:173:in `resolve_dependencies'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/installer.rb:136:in `install!'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/command/install.rb:48:in `run'\r\n /Library/Ruby/Gems/2.3.0/gems/claide-1.0.2/lib/claide/command.rb:334:in `run'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/lib/cocoapods/command.rb:52:in `run'\r\n /Library/Ruby/Gems/2.3.0/gems/cocoapods-1.6.1/bin/pod:55:in `<top (required)>'\r\n /usr/local/bin/pod:22:in `load'\r\n /usr/local/bin/pod:22:in `<main>'\r\n \r\n[ +3 ms] Error output from CocoaPods:\r\n \u21b3\r\n[ ] \r\n [!] Automatically assigning platform `ios` with version `8.0` on target `Runner` because no platform was\r\nspecified. Please\r\n specify a platform for this target in your Podfile. See\r\n`https://guides.cocoapods.org/syntax/podfile.html#platform`.\r\n \r\n[ +4 ms] Error running pod install\r\n[ +2 ms] Error launching application on iPhone X\u0280.\r\n[ +10 ms] \"flutter run\" took 19,575ms.\r\n[ ] \"flutter run\" took 19,575ms.\r\n[ +24 ms] executing: [/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner.xcodeproj/] /usr/bin/xcodebuild -project\r\n/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner.xcodeproj -target Runner -showBuildSettings\r\n[+1446 ms] Exit code 0 from: /usr/bin/xcodebuild -project\r\n/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner.xcodeproj -target Runner -showBuildSettings\r\n[ ] Build settings for action build and target Runner:\r\n ACTION = build\r\n AD_HOC_CODE_SIGNING_ALLOWED = NO\r\n ALTERNATE_GROUP = staff\r\n ALTERNATE_MODE = u+w,go-w,a+rX\r\n ALTERNATE_OWNER = andres\r\n ALWAYS_EMBED_SWIFT_STANDARD_LIBRARIES = YES\r\n ALWAYS_SEARCH_USER_PATHS = NO\r\n ALWAYS_USE_SEPARATE_HEADERMAPS = NO\r\n APPLE_INTERNAL_DEVELOPER_DIR = /AppleInternal/Developer\r\n APPLE_INTERNAL_DIR = /AppleInternal\r\n APPLE_INTERNAL_DOCUMENTATION_DIR = /AppleInternal/Documentation\r\n APPLE_INTERNAL_LIBRARY_DIR = /AppleInternal/Library\r\n APPLE_INTERNAL_TOOLS = /AppleInternal/Developer/Tools\r\n APPLICATION_EXTENSION_API_ONLY = NO\r\n APPLY_RULES_IN_COPY_FILES = NO\r\n ARCHS = armv7 arm64\r\n ARCHS_STANDARD = armv7 arm64\r\n ARCHS_STANDARD_32_64_BIT = armv7 arm64\r\n ARCHS_STANDARD_32_BIT = armv7\r\n ARCHS_STANDARD_64_BIT = arm64\r\n ARCHS_STANDARD_INCLUDING_64_BIT = armv7 arm64\r\n ARCHS_UNIVERSAL_IPHONE_OS = armv7 arm64\r\n ASSETCATALOG_COMPILER_APPICON_NAME = AppIcon\r\n AVAILABLE_PLATFORMS = appletvos appletvsimulator iphoneos iphonesimulator macosx watchos watchsimulator\r\n BITCODE_GENERATION_MODE = marker\r\n BUILD_ACTIVE_RESOURCES_ONLY = NO\r\n BUILD_COMPONENTS = headers build\r\n BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n BUILD_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n BUILD_STYLE = \r\n BUILD_VARIANTS = normal\r\n BUILT_PRODUCTS_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n CACHE_ROOT = /var/folders/cm/ggfqxkh17l5c5p_jcds0x26h0000gn/C/com.apple.DeveloperTools/10.3-10G8/Xcode\r\n CCHROOT = /var/folders/cm/ggfqxkh17l5c5p_jcds0x26h0000gn/C/com.apple.DeveloperTools/10.3-10G8/Xcode\r\n CHMOD = /bin/chmod\r\n CHOWN = /usr/sbin/chown\r\n CLANG_ANALYZER_NONNULL = YES\r\n CLANG_CXX_LANGUAGE_STANDARD = gnu++0x\r\n CLANG_CXX_LIBRARY = libc++\r\n CLANG_ENABLE_MODULES = YES\r\n 
CLANG_ENABLE_OBJC_ARC = YES\r\n CLANG_WARN_BLOCK_CAPTURE_AUTORELEASING = YES\r\n CLANG_WARN_BOOL_CONVERSION = YES\r\n CLANG_WARN_COMMA = YES\r\n CLANG_WARN_CONSTANT_CONVERSION = YES\r\n CLANG_WARN_DIRECT_OBJC_ISA_USAGE = YES_ERROR\r\n CLANG_WARN_EMPTY_BODY = YES\r\n CLANG_WARN_ENUM_CONVERSION = YES\r\n CLANG_WARN_INFINITE_RECURSION = YES\r\n CLANG_WARN_INT_CONVERSION = YES\r\n CLANG_WARN_NON_LITERAL_NULL_CONVERSION = YES\r\n CLANG_WARN_OBJC_LITERAL_CONVERSION = YES\r\n CLANG_WARN_OBJC_ROOT_CLASS = YES_ERROR\r\n CLANG_WARN_RANGE_LOOP_ANALYSIS = YES\r\n CLANG_WARN_STRICT_PROTOTYPES = YES\r\n CLANG_WARN_SUSPICIOUS_MOVE = YES\r\n CLANG_WARN_UNREACHABLE_CODE = YES\r\n CLANG_WARN__DUPLICATE_METHOD_MATCH = YES\r\n CLASS_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/JavaClass\r\n es\r\n CLEAN_PRECOMPS = YES\r\n CLONE_HEADERS = NO\r\n CODESIGNING_FOLDER_PATH =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Runner.app\r\n CODE_SIGNING_ALLOWED = YES\r\n CODE_SIGNING_REQUIRED = YES\r\n CODE_SIGN_CONTEXT_CLASS = XCiPhoneOSCodeSignContext\r\n CODE_SIGN_IDENTITY = iPhone Developer\r\n CODE_SIGN_INJECT_BASE_ENTITLEMENTS = YES\r\n COLOR_DIAGNOSTICS = NO\r\n COMBINE_HIDPI_IMAGES = NO\r\n COMPILER_INDEX_STORE_ENABLE = Default\r\n COMPOSITE_SDK_DIRS = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/CompositeSDKs\r\n COMPRESS_PNG_FILES = YES\r\n CONFIGURATION = Release\r\n CONFIGURATION_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n CONFIGURATION_TEMP_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos\r\n CONTENTS_FOLDER_PATH = Runner.app\r\n COPYING_PRESERVES_HFS_DATA = NO\r\n COPY_HEADERS_RUN_UNIFDEF = NO\r\n COPY_PHASE_STRIP = NO\r\n COPY_RESOURCES_FROM_STATIC_FRAMEWORKS = YES\r\n CORRESPONDING_SIMULATOR_PLATFORM_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform\r\n CORRESPONDING_SIMULATOR_PLATFORM_NAME = iphonesimulator\r\n CORRESPONDING_SIMULATOR_SDK_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneSimulator.platform/Developer/SDKs/iPhoneSimulator12.4.\r\n sdk\r\n CORRESPONDING_SIMULATOR_SDK_NAME = iphonesimulator12.4\r\n CP = /bin/cp\r\n CREATE_INFOPLIST_SECTION_IN_BINARY = NO\r\n CURRENT_ARCH = arm64\r\n CURRENT_PROJECT_VERSION = 1\r\n CURRENT_VARIANT = normal\r\n DEAD_CODE_STRIPPING = YES\r\n DEBUGGING_SYMBOLS = YES\r\n DEBUG_INFORMATION_FORMAT = dwarf-with-dsym\r\n DEFAULT_COMPILER = com.apple.compilers.llvm.clang.1_0\r\n DEFAULT_KEXT_INSTALL_PATH = /System/Library/Extensions\r\n DEFINES_MODULE = NO\r\n DEPLOYMENT_LOCATION = NO\r\n DEPLOYMENT_POSTPROCESSING = NO\r\n DEPLOYMENT_TARGET_CLANG_ENV_NAME = IPHONEOS_DEPLOYMENT_TARGET\r\n DEPLOYMENT_TARGET_CLANG_FLAG_NAME = miphoneos-version-min\r\n DEPLOYMENT_TARGET_CLANG_FLAG_PREFIX = -miphoneos-version-min=\r\n DEPLOYMENT_TARGET_LD_ENV_NAME = IPHONEOS_DEPLOYMENT_TARGET\r\n DEPLOYMENT_TARGET_LD_FLAG_NAME = ios_version_min\r\n DEPLOYMENT_TARGET_SETTING_NAME = IPHONEOS_DEPLOYMENT_TARGET\r\n DEPLOYMENT_TARGET_SUGGESTED_VALUES = 8.0 8.1 8.2 8.3 8.4 9.0 9.1 9.2 9.3 10.0 10.1 10.2 10.3 11.0 11.1 11.2 11.3\r\n 11.4 12.0 12.1 12.2 12.3 12.4\r\n DERIVED_FILES_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/DerivedSo\r\n urces\r\n DERIVED_FILE_DIR =\r\n 
/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/DerivedSo\r\n urces\r\n DERIVED_SOURCES_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/DerivedSo\r\n urces\r\n DEVELOPER_APPLICATIONS_DIR = /Applications/Xcode.app/Contents/Developer/Applications\r\n DEVELOPER_BIN_DIR = /Applications/Xcode.app/Contents/Developer/usr/bin\r\n DEVELOPER_DIR = /Applications/Xcode.app/Contents/Developer\r\n DEVELOPER_FRAMEWORKS_DIR = /Applications/Xcode.app/Contents/Developer/Library/Frameworks\r\n DEVELOPER_FRAMEWORKS_DIR_QUOTED = /Applications/Xcode.app/Contents/Developer/Library/Frameworks\r\n DEVELOPER_LIBRARY_DIR = /Applications/Xcode.app/Contents/Developer/Library\r\n DEVELOPER_SDK_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/MacOSX.platform/Developer/SDKs\r\n DEVELOPER_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Tools\r\n DEVELOPER_USR_DIR = /Applications/Xcode.app/Contents/Developer/usr\r\n DEVELOPMENT_LANGUAGE = English\r\n DOCUMENTATION_FOLDER_PATH = Runner.app/English.lproj/Documentation\r\n DO_HEADER_SCANNING_IN_JAM = NO\r\n DSTROOT = /tmp/Runner.dst\r\n DT_TOOLCHAIN_DIR = /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain\r\n DWARF_DSYM_FILE_NAME = Runner.app.dSYM\r\n DWARF_DSYM_FILE_SHOULD_ACCOMPANY_PRODUCT = NO\r\n DWARF_DSYM_FOLDER_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n EFFECTIVE_PLATFORM_NAME = -iphoneos\r\n EMBEDDED_CONTENT_CONTAINS_SWIFT = YES\r\n EMBEDDED_PROFILE_NAME = embedded.mobileprovision\r\n EMBED_ASSET_PACKS_IN_PRODUCT_BUNDLE = NO\r\n ENABLE_BITCODE = NO\r\n ENABLE_DEFAULT_HEADER_SEARCH_PATHS = YES\r\n ENABLE_HEADER_DEPENDENCIES = YES\r\n ENABLE_NS_ASSERTIONS = NO\r\n ENABLE_ON_DEMAND_RESOURCES = YES\r\n ENABLE_STRICT_OBJC_MSGSEND = YES\r\n ENABLE_TESTABILITY = NO\r\n ENTITLEMENTS_ALLOWED = YES\r\n ENTITLEMENTS_DESTINATION = Signature\r\n ENTITLEMENTS_REQUIRED = YES\r\n EXCLUDED_INSTALLSRC_SUBDIRECTORY_PATTERNS = .DS_Store .svn .git .hg CVS\r\n EXCLUDED_RECURSIVE_SEARCH_PATH_SUBDIRECTORIES = *.nib *.lproj *.framework *.gch *.xcode* *.xcassets (*) .DS_Store\r\n CVS .svn .git .hg *.pbproj *.pbxproj\r\n EXECUTABLES_FOLDER_PATH = Runner.app/Executables\r\n EXECUTABLE_FOLDER_PATH = Runner.app\r\n EXECUTABLE_NAME = Runner\r\n EXECUTABLE_PATH = Runner.app/Runner\r\n EXPANDED_CODE_SIGN_IDENTITY = \r\n EXPANDED_CODE_SIGN_IDENTITY_NAME = \r\n EXPANDED_PROVISIONING_PROFILE = \r\n FILE_LIST =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/Objects/L\r\n inkFileList\r\n FIXED_FILES_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/FixedFile\r\n s\r\n FLUTTER_APPLICATION_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2\r\n FLUTTER_BUILD_DIR = build\r\n FLUTTER_BUILD_NAME = 1.0.0\r\n FLUTTER_BUILD_NUMBER = 1\r\n FLUTTER_FRAMEWORK_DIR = /Users/andres/Documents/Development/flutter/bin/cache/artifacts/engine/ios\r\n FLUTTER_ROOT = /Users/andres/Documents/Development/flutter\r\n FLUTTER_TARGET = /Users/andres/AndroidStudioProjects/appsolidariav2/lib/main.dart\r\n FRAMEWORKS_FOLDER_PATH = Runner.app/Frameworks\r\n FRAMEWORK_FLAG_PREFIX = -framework\r\n FRAMEWORK_SEARCH_PATHS =\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/BoringSSL-GRPC\"\r\n 
\"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseCore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseDatabase\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseFirestore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseInstanceID\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/GoogleUtilities\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Protobuf\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/TOCropViewController\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/cloud_firestore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_core\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_database\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/flutter_full_pdf_viewer\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/fluttertoast\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-C++\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-Core\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_cropper\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_picker\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/leveldb-library\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/nanopb\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/path_provider\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/printing\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/../.symlinks/flutter/ios\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/FirebaseAnalytics/Frameworks\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/GoogleAppMeasurement/Frameworks\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/BoringSSL-GRPC\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseCore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseDatabase\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseFirestore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseInstanceID\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/GoogleUtilities\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Protobuf\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/TOCropViewController\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/cloud_firestore\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_core\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_database\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/flutter_full_pdf_viewer\"\r\n 
\"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/fluttertoast\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-C++\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-Core\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_cropper\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_picker\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/leveldb-library\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/nanopb\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/path_provider\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/printing\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/../.symlinks/flutter/ios\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/FirebaseAnalytics/Frameworks\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/GoogleAppMeasurement/Frameworks\"\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Flutter\r\n FRAMEWORK_VERSION = A\r\n FULL_PRODUCT_NAME = Runner.app\r\n GCC3_VERSION = 3.3\r\n GCC_C_LANGUAGE_STANDARD = gnu99\r\n GCC_INLINES_ARE_PRIVATE_EXTERN = YES\r\n GCC_NO_COMMON_BLOCKS = YES\r\n GCC_PFE_FILE_C_DIALECTS = c objective-c c++ objective-c++\r\n GCC_PREPROCESSOR_DEFINITIONS = COCOAPODS=1 GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS=1 PB_FIELD_32BIT=1\r\n PB_NO_PACKED_STRUCTS=1 PB_ENABLE_MALLOC=1 COCOAPODS=1 COCOAPODS=1 GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS=1\r\n PB_FIELD_32BIT=1 PB_NO_PACKED_STRUCTS=1 PB_ENABLE_MALLOC=1 GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS=1 COCOAPODS=1\r\n GPB_USE_PROTOBUF_FRAMEWORK_IMPORTS=1 PB_FIELD_32BIT=1 PB_NO_PACKED_STRUCTS=1 PB_ENABLE_MALLOC=1 PB_FIELD_32BIT=1\r\n PB_NO_PACKED_STRUCTS=1 PB_ENABLE_MALLOC=1\r\n GCC_SYMBOLS_PRIVATE_EXTERN = YES\r\n GCC_THUMB_SUPPORT = YES\r\n GCC_TREAT_WARNINGS_AS_ERRORS = NO\r\n GCC_VERSION = com.apple.compilers.llvm.clang.1_0\r\n GCC_VERSION_IDENTIFIER = com_apple_compilers_llvm_clang_1_0\r\n GCC_WARN_64_TO_32_BIT_CONVERSION = YES\r\n GCC_WARN_ABOUT_RETURN_TYPE = YES_ERROR\r\n GCC_WARN_UNDECLARED_SELECTOR = YES\r\n GCC_WARN_UNINITIALIZED_AUTOS = YES_AGGRESSIVE\r\n GCC_WARN_UNUSED_FUNCTION = YES\r\n GCC_WARN_UNUSED_VARIABLE = YES\r\n GENERATE_MASTER_OBJECT_FILE = NO\r\n GENERATE_PKGINFO_FILE = YES\r\n GENERATE_PROFILING_CODE = NO\r\n GENERATE_TEXT_BASED_STUBS = NO\r\n GID = 20\r\n GROUP = staff\r\n HEADERMAP_INCLUDES_FLAT_ENTRIES_FOR_TARGET_BEING_BUILT = YES\r\n HEADERMAP_INCLUDES_FRAMEWORK_ENTRIES_FOR_ALL_PRODUCT_TYPES = YES\r\n HEADERMAP_INCLUDES_NONPUBLIC_NONPRIVATE_HEADERS = YES\r\n HEADERMAP_INCLUDES_PROJECT_HEADERS = YES\r\n HEADERMAP_USES_FRAMEWORK_PREFIX_ENTRIES = YES\r\n HEADERMAP_USES_VFS = NO\r\n HEADER_SEARCH_PATHS =\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/BoringSSL-GRPC/openssl_grpc.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseCore/FirebaseCore.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseDatabase/FirebaseDatabase.\r\n framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseFirestore/FirebaseFirestor\r\n e.framework/Headers\"\r\n 
\"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseInstanceID/FirebaseInstanc\r\n eID.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/GoogleUtilities/GoogleUtilities.fr\r\n amework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Protobuf/Protobuf.framework/Header\r\n s\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/TOCropViewController/TOCropViewCon\r\n troller.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/cloud_firestore/cloud_firestore.fr\r\n amework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_core/firebase_core.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_database/firebase_databas\r\n e.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/flutter_full_pdf_viewer/flutter_fu\r\n ll_pdf_viewer.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/fluttertoast/fluttertoast.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-C++/grpcpp.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-Core/grpc.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_cropper/image_cropper.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_picker/image_picker.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/leveldb-library/leveldb.framework/\r\n Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/nanopb/nanopb.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/path_provider/path_provider.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/printing/printing.framework/Header\r\n s\" \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/Firebase\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/FirebaseAuthInterop\"\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Firebase/CoreOnly/Sources\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/BoringSSL-GRPC/openssl_grpc.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseCore/FirebaseCore.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseDatabase/FirebaseDatabase.\r\n framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseFirestore/FirebaseFirestor\r\n e.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseInstanceID/FirebaseInstanc\r\n eID.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/GoogleUtilities/GoogleUtilities.fr\r\n amework/Headers\"\r\n 
\"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Protobuf/Protobuf.framework/Header\r\n s\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/TOCropViewController/TOCropViewCon\r\n troller.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/cloud_firestore/cloud_firestore.fr\r\n amework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_core/firebase_core.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_database/firebase_databas\r\n e.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/flutter_full_pdf_viewer/flutter_fu\r\n ll_pdf_viewer.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/fluttertoast/fluttertoast.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-C++/grpcpp.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-Core/grpc.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_cropper/image_cropper.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_picker/image_picker.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/leveldb-library/leveldb.framework/\r\n Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/nanopb/nanopb.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/path_provider/path_provider.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/printing/printing.framework/Header\r\n s\" \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/Firebase\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/FirebaseAuthInterop\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/BoringSSL-GRPC/openssl_grpc.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseCore/FirebaseCore.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseDatabase/FirebaseDatabase.\r\n framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseFirestore/FirebaseFirestor\r\n e.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/FirebaseInstanceID/FirebaseInstanc\r\n eID.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/GoogleUtilities/GoogleUtilities.fr\r\n amework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Protobuf/Protobuf.framework/Header\r\n s\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/TOCropViewController/TOCropViewCon\r\n troller.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/cloud_firestore/cloud_firestore.fr\r\n 
amework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_core/firebase_core.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/firebase_database/firebase_databas\r\n e.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/flutter_full_pdf_viewer/flutter_fu\r\n ll_pdf_viewer.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/fluttertoast/fluttertoast.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-C++/grpcpp.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/gRPC-Core/grpc.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_cropper/image_cropper.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/image_picker/image_picker.framewor\r\n k/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/leveldb-library/leveldb.framework/\r\n Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/nanopb/nanopb.framework/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/path_provider/path_provider.framew\r\n ork/Headers\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/printing/printing.framework/Header\r\n s\" \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/Firebase\"\r\n \"/Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Headers/Public/FirebaseAuthInterop\"\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Firebase/CoreOnly/Sources\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods/Firebase/CoreOnly/Sources\r\n HIDE_BITCODE_SYMBOLS = YES\r\n HOME = /Users/andres\r\n ICONV = /usr/bin/iconv\r\n INFOPLIST_EXPAND_BUILD_SETTINGS = YES\r\n INFOPLIST_FILE = Runner/Info.plist\r\n INFOPLIST_OUTPUT_FORMAT = binary\r\n INFOPLIST_PATH = Runner.app/Info.plist\r\n INFOPLIST_PREPROCESS = NO\r\n INFOSTRINGS_PATH = Runner.app/English.lproj/InfoPlist.strings\r\n INLINE_PRIVATE_FRAMEWORKS = NO\r\n INSTALLHDRS_COPY_PHASE = NO\r\n INSTALLHDRS_SCRIPT_PHASE = NO\r\n INSTALL_DIR = /tmp/Runner.dst/Applications\r\n INSTALL_GROUP = staff\r\n INSTALL_MODE_FLAG = u+w,go-w,a+rX\r\n INSTALL_OWNER = andres\r\n INSTALL_PATH = /Applications\r\n INSTALL_ROOT = /tmp/Runner.dst\r\n IPHONEOS_DEPLOYMENT_TARGET = 8.0\r\n JAVAC_DEFAULT_FLAGS = -J-Xms64m -J-XX:NewSize=4M -J-Dfile.encoding=UTF8\r\n JAVA_APP_STUB = /System/Library/Frameworks/JavaVM.framework/Resources/MacOS/JavaApplicationStub\r\n JAVA_ARCHIVE_CLASSES = YES\r\n JAVA_ARCHIVE_TYPE = JAR\r\n JAVA_COMPILER = /usr/bin/javac\r\n JAVA_FOLDER_PATH = Runner.app/Java\r\n JAVA_FRAMEWORK_RESOURCES_DIRS = Resources\r\n JAVA_JAR_FLAGS = cv\r\n JAVA_SOURCE_SUBDIR = .\r\n JAVA_USE_DEPENDENCIES = YES\r\n JAVA_ZIP_FLAGS = -urg\r\n JIKES_DEFAULT_FLAGS = +E +OLDCSO\r\n KASAN_DEFAULT_CFLAGS = -DKASAN=1 -fsanitize=address -mllvm -asan-globals-live-support -mllvm\r\n -asan-force-dynamic-shadow\r\n KEEP_PRIVATE_EXTERNS = NO\r\n LD_DEPENDENCY_INFO_FILE =\r\n 
/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/Objects-n\r\n ormal/arm64/Runner_dependency_info.dat\r\n LD_GENERATE_MAP_FILE = NO\r\n LD_MAP_FILE_PATH =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/Runner-Li\r\n nkMap-normal-arm64.txt\r\n LD_NO_PIE = NO\r\n LD_QUOTE_LINKER_ARGUMENTS_FOR_COMPILER_DRIVER = YES\r\n LD_RUNPATH_SEARCH_PATHS = '@executable_path/Frameworks' '@loader_path/Frameworks' '@executable_path/Frameworks'\r\n '@loader_path/Frameworks' @executable_path/Frameworks\r\n LEGACY_DEVELOPER_DIR =\r\n /Applications/Xcode.app/Contents/PlugIns/Xcode3Core.ideplugin/Contents/SharedSupport/Developer\r\n LEX = lex\r\n LIBRARY_FLAG_NOSPACE = YES\r\n LIBRARY_FLAG_PREFIX = -l\r\n LIBRARY_KEXT_INSTALL_PATH = /Library/Extensions\r\n LIBRARY_SEARCH_PATHS = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Flutter\r\n LINKER_DISPLAYS_MANGLED_NAMES = NO\r\n LINK_FILE_LIST_normal_arm64 = \r\n LINK_FILE_LIST_normal_armv7 = \r\n LINK_WITH_STANDARD_LIBRARIES = YES\r\n LOCALIZABLE_CONTENT_DIR = \r\n LOCALIZED_RESOURCES_FOLDER_PATH = Runner.app/English.lproj\r\n LOCALIZED_STRING_MACRO_NAMES = NSLocalizedString CFLocalizedString\r\n LOCAL_ADMIN_APPS_DIR = /Applications/Utilities\r\n LOCAL_APPS_DIR = /Applications\r\n LOCAL_DEVELOPER_DIR = /Library/Developer\r\n LOCAL_LIBRARY_DIR = /Library\r\n LOCROOT = \r\n LOCSYMROOT = \r\n MACH_O_TYPE = mh_execute\r\n MAC_OS_X_PRODUCT_BUILD_VERSION = 18F132\r\n MAC_OS_X_VERSION_ACTUAL = 101405\r\n MAC_OS_X_VERSION_MAJOR = 101400\r\n MAC_OS_X_VERSION_MINOR = 1405\r\n METAL_LIBRARY_FILE_BASE = default\r\n METAL_LIBRARY_OUTPUT_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/Runner.app\r\n MODULE_CACHE_DIR = /Users/andres/Library/Developer/Xcode/DerivedData/ModuleCache.noindex\r\n MTL_ENABLE_DEBUG_INFO = NO\r\n NATIVE_ARCH = armv7\r\n NATIVE_ARCH_32_BIT = i386\r\n NATIVE_ARCH_64_BIT = x86_64\r\n NATIVE_ARCH_ACTUAL = x86_64\r\n NO_COMMON = YES\r\n OBJECT_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/Objects\r\n OBJECT_FILE_DIR_normal =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/Objects-n\r\n ormal\r\n OBJROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n ONLY_ACTIVE_ARCH = NO\r\n OS = MACOS\r\n OSAC = /usr/bin/osacompile\r\n OTHER_LDFLAGS = -ObjC -l\"c++\" -l\"icucore\" -l\"sqlite3\" -l\"z\" -framework \"CFNetwork\" -framework\r\n \"FIRAnalyticsConnector\" -framework \"FirebaseAnalytics\" -framework \"FirebaseCore\" -framework\r\n \"FirebaseCoreDiagnostics\" -framework \"FirebaseDatabase\" -framework \"FirebaseFirestore\" -framework\r\n \"FirebaseInstanceID\" -framework \"Flutter\" -framework \"Foundation\" -framework \"GoogleAppMeasurement\" -framework\r\n \"GoogleUtilities\" -framework \"MobileCoreServices\" -framework \"Protobuf\" -framework \"Security\" -framework\r\n \"StoreKit\" -framework \"SystemConfiguration\" -framework \"TOCropViewController\" -framework \"cloud_firestore\"\r\n -framework \"firebase_core\" -framework \"firebase_database\" -framework \"flutter_full_pdf_viewer\" -framework\r\n \"fluttertoast\" -framework \"grpc\" -framework \"grpcpp\" -framework \"image_cropper\" -framework \"image_picker\"\r\n -framework \"leveldb\" -framework \"nanopb\" -framework \"openssl_grpc\" -framework \"path_provider\" -framework\r\n \"printing\" -ObjC 
-l\"c++\" -l\"icucore\" -l\"sqlite3\" -l\"z\" -framework \"CFNetwork\" -framework \"FIRAnalyticsConnector\"\r\n -framework \"FirebaseAnalytics\" -framework \"FirebaseCore\" -framework \"FirebaseCoreDiagnostics\" -framework\r\n \"FirebaseDatabase\" -framework \"FirebaseFirestore\" -framework \"FirebaseInstanceID\" -framework \"Flutter\" -framework\r\n \"Foundation\" -framework \"GoogleAppMeasurement\" -framework \"GoogleUtilities\" -framework \"MobileCoreServices\"\r\n -framework \"Protobuf\" -framework \"Security\" -framework \"StoreKit\" -framework \"SystemConfiguration\" -framework\r\n \"TOCropViewController\" -framework \"cloud_firestore\" -framework \"firebase_core\" -framework \"firebase_database\"\r\n -framework \"flutter_full_pdf_viewer\" -framework \"fluttertoast\" -framework \"grpc\" -framework \"grpcpp\" -framework\r\n \"image_cropper\" -framework \"image_picker\" -framework \"leveldb\" -framework \"nanopb\" -framework \"openssl_grpc\"\r\n -framework \"path_provider\" -framework \"printing\"\r\n OTHER_SWIFT_FLAGS = -D COCOAPODS -D COCOAPODS\r\n PACKAGE_TYPE = com.apple.package-type.wrapper.application\r\n PASCAL_STRINGS = YES\r\n PATH =\r\n /Applications/Xcode.app/Contents/Developer/usr/bin:/usr/local/opt/node@10/bin:/usr/local/bin:/usr/bin:/bin:/usr/s\r\n bin:/sbin:/Users/andres/Documents/Development/flutter/bin\r\n PATH_PREFIXES_EXCLUDED_FROM_HEADER_DEPENDENCIES = /usr/include /usr/local/include /System/Library/Frameworks\r\n /System/Library/PrivateFrameworks /Applications/Xcode.app/Contents/Developer/Headers\r\n /Applications/Xcode.app/Contents/Developer/SDKs /Applications/Xcode.app/Contents/Developer/Platforms\r\n PBDEVELOPMENTPLIST_PATH = Runner.app/pbdevelopment.plist\r\n PFE_FILE_C_DIALECTS = objective-c\r\n PKGINFO_FILE_PATH =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/PkgInfo\r\n PKGINFO_PATH = Runner.app/PkgInfo\r\n PLATFORM_DEVELOPER_APPLICATIONS_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/Applications\r\n PLATFORM_DEVELOPER_BIN_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr/bin\r\n PLATFORM_DEVELOPER_LIBRARY_DIR =\r\n /Applications/Xcode.app/Contents/PlugIns/Xcode3Core.ideplugin/Contents/SharedSupport/Developer/Library\r\n PLATFORM_DEVELOPER_SDK_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs\r\n PLATFORM_DEVELOPER_TOOLS_DIR =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/Tools\r\n PLATFORM_DEVELOPER_USR_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/usr\r\n PLATFORM_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform\r\n PLATFORM_DISPLAY_NAME = iOS\r\n PLATFORM_NAME = iphoneos\r\n PLATFORM_PREFERRED_ARCH = arm64\r\n PLATFORM_PRODUCT_BUILD_VERSION = 16G73\r\n PLIST_FILE_OUTPUT_FORMAT = binary\r\n PLUGINS_FOLDER_PATH = Runner.app/PlugIns\r\n PODS_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n PODS_CONFIGURATION_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n PODS_PODFILE_DIR_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/.\r\n PODS_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Pods\r\n PRECOMPS_INCLUDE_HEADERS_FROM_BUILT_PRODUCTS_DIR = YES\r\n PRECOMP_DESTINATION_DIR =\r\n 
/Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/PrefixHea\r\n ders\r\n PRESERVE_DEAD_CODE_INITS_AND_TERMS = NO\r\n PRIVATE_HEADERS_FOLDER_PATH = Runner.app/PrivateHeaders\r\n PRODUCT_BUNDLE_IDENTIFIER = co.com.creece.appsolidariav2\r\n PRODUCT_MODULE_NAME = Runner\r\n PRODUCT_NAME = Runner\r\n PRODUCT_SETTINGS_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner/Info.plist\r\n PRODUCT_TYPE = com.apple.product-type.application\r\n PROFILING_CODE = NO\r\n PROJECT = Runner\r\n PROJECT_DERIVED_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/DerivedSources\r\n PROJECT_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n PROJECT_FILE_PATH = /Users/andres/AndroidStudioProjects/appsolidariav2/ios/Runner.xcodeproj\r\n PROJECT_NAME = Runner\r\n PROJECT_TEMP_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build\r\n PROJECT_TEMP_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n PROVISIONING_PROFILE_REQUIRED = YES\r\n PUBLIC_HEADERS_FOLDER_PATH = Runner.app/Headers\r\n RECURSIVE_SEARCH_PATHS_FOLLOW_SYMLINKS = YES\r\n REMOVE_CVS_FROM_RESOURCES = YES\r\n REMOVE_GIT_FROM_RESOURCES = YES\r\n REMOVE_HEADERS_FROM_EMBEDDED_BUNDLES = YES\r\n REMOVE_HG_FROM_RESOURCES = YES\r\n REMOVE_SVN_FROM_RESOURCES = YES\r\n RESOURCE_RULES_REQUIRED = YES\r\n REZ_COLLECTOR_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/ResourceM\r\n anagerResources\r\n REZ_OBJECTS_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build/ResourceM\r\n anagerResources/Objects\r\n SCAN_ALL_SOURCE_FILES_FOR_INCLUDES = NO\r\n SCRIPTS_FOLDER_PATH = Runner.app/Scripts\r\n SDKROOT = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_DIR = /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_DIR_iphoneos12_4 =\r\n /Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS12.4.sdk\r\n SDK_NAME = iphoneos12.4\r\n SDK_NAMES = iphoneos12.4\r\n SDK_PRODUCT_BUILD_VERSION = 16G73\r\n SDK_VERSION = 12.4\r\n SDK_VERSION_ACTUAL = 120400\r\n SDK_VERSION_MAJOR = 120000\r\n SDK_VERSION_MINOR = 400\r\n SED = /usr/bin/sed\r\n SEPARATE_STRIP = NO\r\n SEPARATE_SYMBOL_EDIT = NO\r\n SET_DIR_MODE_OWNER_GROUP = YES\r\n SET_FILE_MODE_OWNER_GROUP = NO\r\n SHALLOW_BUNDLE = YES\r\n SHARED_DERIVED_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos/DerivedSources\r\n SHARED_FRAMEWORKS_FOLDER_PATH = Runner.app/SharedFrameworks\r\n SHARED_PRECOMPS_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/SharedPrecompiledHeaders\r\n SHARED_SUPPORT_FOLDER_PATH = Runner.app/SharedSupport\r\n SKIP_INSTALL = NO\r\n SOURCE_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n SRCROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/ios\r\n STRINGS_FILE_OUTPUT_ENCODING = binary\r\n STRIP_BITCODE_FROM_COPIED_FILES = YES\r\n STRIP_INSTALLED_PRODUCT = YES\r\n STRIP_STYLE = all\r\n STRIP_SWIFT_SYMBOLS = YES\r\n SUPPORTED_DEVICE_FAMILIES = 1,2\r\n SUPPORTED_PLATFORMS = iphonesimulator iphoneos\r\n SUPPORTS_TEXT_BASED_API = NO\r\n SWIFT_PLATFORM_TARGET_PREFIX = ios\r\n SYMROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n SYSTEM_ADMIN_APPS_DIR = 
/Applications/Utilities\r\n SYSTEM_APPS_DIR = /Applications\r\n SYSTEM_CORE_SERVICES_DIR = /System/Library/CoreServices\r\n SYSTEM_DEMOS_DIR = /Applications/Extras\r\n SYSTEM_DEVELOPER_APPS_DIR = /Applications/Xcode.app/Contents/Developer/Applications\r\n SYSTEM_DEVELOPER_BIN_DIR = /Applications/Xcode.app/Contents/Developer/usr/bin\r\n SYSTEM_DEVELOPER_DEMOS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Utilities/Built Examples\r\n SYSTEM_DEVELOPER_DIR = /Applications/Xcode.app/Contents/Developer\r\n SYSTEM_DEVELOPER_DOC_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference Library\r\n SYSTEM_DEVELOPER_GRAPHICS_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Graphics Tools\r\n SYSTEM_DEVELOPER_JAVA_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Java Tools\r\n SYSTEM_DEVELOPER_PERFORMANCE_TOOLS_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Performance\r\n Tools\r\n SYSTEM_DEVELOPER_RELEASENOTES_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference Library/releasenotes\r\n SYSTEM_DEVELOPER_TOOLS = /Applications/Xcode.app/Contents/Developer/Tools\r\n SYSTEM_DEVELOPER_TOOLS_DOC_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference\r\n Library/documentation/DeveloperTools\r\n SYSTEM_DEVELOPER_TOOLS_RELEASENOTES_DIR = /Applications/Xcode.app/Contents/Developer/ADC Reference\r\n Library/releasenotes/DeveloperTools\r\n SYSTEM_DEVELOPER_USR_DIR = /Applications/Xcode.app/Contents/Developer/usr\r\n SYSTEM_DEVELOPER_UTILITIES_DIR = /Applications/Xcode.app/Contents/Developer/Applications/Utilities\r\n SYSTEM_DOCUMENTATION_DIR = /Library/Documentation\r\n SYSTEM_KEXT_INSTALL_PATH = /System/Library/Extensions\r\n SYSTEM_LIBRARY_DIR = /System/Library\r\n TAPI_VERIFY_MODE = ErrorsOnly\r\n TARGETED_DEVICE_FAMILY = 1,2\r\n TARGETNAME = Runner\r\n TARGET_BUILD_DIR = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Release-iphoneos\r\n TARGET_NAME = Runner\r\n TARGET_TEMP_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_FILES_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_FILE_DIR =\r\n /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios/Runner.build/Release-iphoneos/Runner.build\r\n TEMP_ROOT = /Users/andres/AndroidStudioProjects/appsolidariav2/build/ios\r\n TOOLCHAIN_DIR = /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain\r\n TREAT_MISSING_BASELINES_AS_TEST_FAILURES = NO\r\n UID = 501\r\n UNLOCALIZED_RESOURCES_FOLDER_PATH = Runner.app\r\n UNSTRIPPED_PRODUCT = NO\r\n USER = andres\r\n USER_APPS_DIR = /Users/andres/Applications\r\n USER_LIBRARY_DIR = /Users/andres/Library\r\n USE_DYNAMIC_NO_PIC = YES\r\n USE_HEADERMAP = YES\r\n USE_HEADER_SYMLINKS = NO\r\n VALIDATE_PRODUCT = YES\r\n VALID_ARCHS = arm64 arm64e armv7 armv7s\r\n VERBOSE_PBXCP = NO\r\n VERSIONING_SYSTEM = apple-generic\r\n VERSIONPLIST_PATH = Runner.app/version.plist\r\n VERSION_INFO_BUILDER = andres\r\n VERSION_INFO_FILE = Runner_vers.c\r\n VERSION_INFO_STRING = \"@(#)PROGRAM:Runner PROJECT:Runner-1\"\r\n WRAPPER_EXTENSION = app\r\n WRAPPER_NAME = Runner.app\r\n WRAPPER_SUFFIX = .app\r\n WRAP_ASSET_PACKS_IN_SEPARATE_DIRECTORIES = NO\r\n XCODE_APP_SUPPORT_DIR = /Applications/Xcode.app/Contents/Developer/Library/Xcode\r\n 
XCODE_PRODUCT_BUILD_VERSION = 10G8\r\n XCODE_VERSION_ACTUAL = 1030\r\n XCODE_VERSION_MAJOR = 1000\r\n XCODE_VERSION_MINOR = 1030\r\n XPCSERVICES_FOLDER_PATH = Runner.app/XPCServices\r\n YACC = yacc\r\n arch = arm64\r\n variant = normal\r\n\r\n#0 throwToolExit (package:flutter_tools/src/base/common.dart:28:3)\r\n#1 RunCommand.runCommand (package:flutter_tools/src/commands/run.dart:475:7)\r\n<asynchronous suspension>\r\n#2 FlutterCommand.verifyThenRunCommand (package:flutter_tools/src/runner/flutter_command.dart:478:18)\r\n<asynchronous suspension>\r\n#3 FlutterCommand.run.<anonymous closure> (package:flutter_tools/src/runner/flutter_command.dart:383:33)\r\n<asynchronous suspension>\r\n#4 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:29)\r\n<asynchronous suspension>\r\n#5 _rootRun (dart:async/zone.dart:1124:13)\r\n#6 _CustomZone.run (dart:async/zone.dart:1021:19)\r\n#7 _runZoned (dart:async/zone.dart:1516:10)\r\n#8 runZoned (dart:async/zone.dart:1463:12)\r\n#9 AppContext.run (package:flutter_tools/src/base/context.dart:152:18)\r\n<asynchronous suspension>\r\n#10 FlutterCommand.run (package:flutter_tools/src/runner/flutter_command.dart:375:20)\r\n#11 CommandRunner.runCommand (package:args/command_runner.dart:197:27)\r\n<asynchronous suspension>\r\n#12 FlutterCommandRunner.runCommand.<anonymous closure>\r\n(package:flutter_tools/src/runner/flutter_command_runner.dart:396:21)\r\n<asynchronous suspension>\r\n#13 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:29)\r\n<asynchronous suspension>\r\n#14 _rootRun (dart:async/zone.dart:1124:13)\r\n#15 _CustomZone.run (dart:async/zone.dart:1021:19)\r\n#16 _runZoned (dart:async/zone.dart:1516:10)\r\n#17 runZoned (dart:async/zone.dart:1463:12)\r\n#18 AppContext.run (package:flutter_tools/src/base/context.dart:152:18)\r\n<asynchronous suspension>\r\n#19 FlutterCommandRunner.runCommand (package:flutter_tools/src/runner/flutter_command_runner.dart:356:19)\r\n<asynchronous suspension>\r\n#20 CommandRunner.run.<anonymous closure> (package:args/command_runner.dart:112:25)\r\n#21 new Future.sync (dart:async/future.dart:224:31)\r\n#22 CommandRunner.run (package:args/command_runner.dart:112:14)\r\n#23 FlutterCommandRunner.run (package:flutter_tools/src/runner/flutter_command_runner.dart:242:18)\r\n#24 run.<anonymous closure>.<anonymous closure> (package:flutter_tools/runner.dart:62:22)\r\n<asynchronous suspension>\r\n#25 _rootRun (dart:async/zone.dart:1124:13)\r\n#26 _CustomZone.run (dart:async/zone.dart:1021:19)\r\n#27 _runZoned (dart:async/zone.dart:1516:10)\r\n#28 runZoned (dart:async/zone.dart:1500:12)\r\n#29 run.<anonymous closure> (package:flutter_tools/runner.dart:60:18)\r\n<asynchronous suspension>\r\n#30 AppContext.run.<anonymous closure> (package:flutter_tools/src/base/context.dart:153:29)\r\n<asynchronous suspension>\r\n#31 _rootRun (dart:async/zone.dart:1124:13)\r\n#32 _CustomZone.run (dart:async/zone.dart:1021:19)\r\n#33 _runZoned (dart:async/zone.dart:1516:10)\r\n#34 runZoned (dart:async/zone.dart:1463:12)\r\n#35 AppContext.run (package:flutter_tools/src/base/context.dart:152:18)\r\n<asynchronous suspension>\r\n#36 runInContext (package:flutter_tools/src/context_runner.dart:56:24)\r\n<asynchronous suspension>\r\n#37 run (package:flutter_tools/runner.dart:51:10)\r\n#38 main (package:flutter_tools/executable.dart:62:9)\r\n<asynchronous suspension>\r\n#39 main (file:///Users/andres/Documents/Development/flutter/packages/flutter_tools/bin/flutter_tools.dart:8:3)\r\n#40 
_startIsolate.<anonymous closure> (dart:isolate-patch/isolate_patch.dart:299:32)\r\n#41 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:172:12)\r\n```\r\n\r\n<!--\r\n Run `flutter analyze` and attach any output of that command below.\r\n If there are any analysis errors, try resolving them before filing this issue.\r\n-->\r\n\r\n```\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:29:7 \u2022 non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:37:7 \u2022 non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:76:14 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:83:14 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:297:10 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:298:10 \u2022\r\n non_constant_identifier_names\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page1.dart:73:10 \u2022 must_call_super\r\n info \u2022 This function has a return type of 'void Function()', but doesn't end with a return statement \u2022\r\n lib/screens/page1.dart:275:3 \u2022 missing_return\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page1.dart:357:10 \u2022 must_call_super\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page1.dart:512:29 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page1.dart:512:29 \u2022\r\n invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page2.dart:30:10 \u2022 must_call_super\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page2.dart:62:27 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page2.dart:62:27 \u2022 invalid_use_of_protected_member\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page2.dart:99:29 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page2.dart:99:29 \u2022 invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 
'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page2.dart:170:10 \u2022 must_call_super\r\n info \u2022 Avoid using unnecessary statements \u2022 lib/screens/pdf/document.dart:21:5 \u2022 unnecessary_statements\r\n error \u2022 The method 'addPage' isn't defined for the class 'PdfDocument' \u2022 lib/screens/pdf/document.dart:184:7 \u2022\r\n undefined_method\r\n error \u2022 The return type 'PdfDocument' isn't a 'Future<Document>', as defined by the method 'generateDocument' \u2022\r\n lib/screens/pdf/document.dart:409:10 \u2022 return_of_invalid_type\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/poliza.dart:119:39 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/poliza.dart:119:39 \u2022\r\n invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/temp.dart:31:10 \u2022 must_call_super\r\n\r\n23 issues found. (ran in 75.8s)\r\nmacs-mac-mini:appsolidariav2 andres$ flutter analyze\r\nAnalyzing appsolidariav2... \r\n\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:29:7 \u2022 non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:37:7 \u2022 non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:76:14 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:83:14 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:297:10 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/model/auxiliarModel.dart:298:10 \u2022\r\n non_constant_identifier_names\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page1.dart:73:10 \u2022 must_call_super\r\n info \u2022 This function has a return type of 'void Function()', but doesn't end with a return statement \u2022\r\n lib/screens/page1.dart:275:3 \u2022 missing_return\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page1.dart:357:10 \u2022 must_call_super\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page1.dart:512:29 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page1.dart:512:29 \u2022\r\n invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 
lib/screens/page2.dart:30:10 \u2022 must_call_super\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page2.dart:62:27 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page2.dart:62:27 \u2022 invalid_use_of_protected_member\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/page2.dart:99:29 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/page2.dart:99:29 \u2022 invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/page2.dart:170:10 \u2022 must_call_super\r\n info \u2022 Avoid using unnecessary statements \u2022 lib/screens/pdf/document.dart:21:5 \u2022 unnecessary_statements\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/screens/pdf/document.dart:129:16 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/screens/pdf/document.dart:130:16 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/screens/pdf/document.dart:131:16 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/screens/pdf/document.dart:132:16 \u2022\r\n non_constant_identifier_names\r\n info \u2022 Name non-constant identifiers using lowerCamelCase \u2022 lib/screens/pdf/document.dart:133:16 \u2022\r\n non_constant_identifier_names\r\n info \u2022 The member 'notifyListeners' can only be used within 'package:flutter/src/foundation/change_notifier.dart' or a test \u2022\r\n lib/screens/poliza.dart:119:39 \u2022 invalid_use_of_visible_for_testing_member\r\n info \u2022 The member 'notifyListeners' can only be used within instance members of subclasses of\r\n 'package:flutter/src/foundation/change_notifier.dart' \u2022 lib/screens/poliza.dart:119:39 \u2022\r\n invalid_use_of_protected_member\r\n info \u2022 This method overrides a method annotated as @mustCallSuper in 'AutomaticKeepAliveClientMixin', but does not invoke the\r\n overridden method \u2022 lib/screens/temp.dart:31:10 \u2022 must_call_super\r\n\r\n26 issues found. (ran in 52.0s)\r\n\r\n```\r\n\r\n<!-- Finally, paste the output of running `flutter doctor -v` here. -->\r\n\r\n```\r\nDoctor summary (to see all details, run flutter doctor -v):\r\n[\u2713] Flutter (Channel stable, v1.7.8+hotfix.4, on Mac OS X 10.14.5 18F132, locale en-CO)\r\n \r\n[\u2713] Android toolchain - develop for Android devices (Android SDK version 28.0.3)\r\n[\u2713] Xcode - develop for iOS and macOS (Xcode 10.3)\r\n[\u2713] iOS tools - develop for iOS devices\r\n[\u2713] Android Studio (version 3.4)\r\n[\u2713] Connected device (1 available)\r\n\r\n\u2022 No issues found!\r\n\r\n```\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# User ActiveStorage instead of Paperclip\n\nSee docs here:\r\nhttps://guides.rubyonrails.org/active_storage_overview.html","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Router not reusing parent component when changing only the child route","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add me","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Include Build and Deploy Instructions","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add getting started section to README","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Usage with precommit hooks","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Rethink Select mode","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"A guy is scamming innocent people by selling your code....","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# The automated release is failing \ud83d\udea8\n\n## :rotating_light: The automated release from the `undefined` branch failed. :rotating_light:\n\nI recommend you give this issue a high priority, so other packages depending on you could benefit from your bug fixes and new features.\n\nYou can find below the list of errors reported by **semantic-release**. Each one of them has to be resolved in order to automatically publish your package. I\u2019m sure you can resolve this \ud83d\udcaa.\n\nErrors are usually caused by a misconfiguration or an authentication problem. With each error reported below you will find explanation and guidance to help you to resolve it.\n\nOnce all the errors are resolved, **semantic-release** will release your package the next time you push a commit to the `undefined` branch. You can also manually restart the failed CI job that runs **semantic-release**.\n\nIf you are not sure how to resolve this, here is some links that can help you:\n- [Usage documentation](https://github.com/semantic-release/semantic-release/blob/caribou/docs/usage/README.md)\n- [Frequently Asked Questions](https://github.com/semantic-release/semantic-release/blob/caribou/docs/support/FAQ.md)\n- [Support channels](https://github.com/semantic-release/semantic-release#get-help)\n\nIf those don\u2019t help, or if this issue is reporting something you think isn\u2019t right, you can always ask the humans behind **[semantic-release](https://github.com/semantic-release/semantic-release/issues/new)**.\n\n---\n\n### Invalid npm token.\n\nThe [npm token](https://github.com/semantic-release/npm/blob/master/README.md#npm-registry-authentication) configured in the `NPM_TOKEN` environment variable must be a valid [token](https://docs.npmjs.com/getting-started/working_with_tokens) allowing to publish to the registry `https://registry.npmjs.org/`.\n\nIf you are using Two-Factor Authentication, make configure the `auth-only` [level](https://docs.npmjs.com/getting-started/using-two-factor-authentication#levels-of-authentication) is supported. **semantic-release** cannot publish with the default `auth-and-writes` level.\n\nPlease make sure to set the `NPM_TOKEN` environment variable in your CI with the exact value of the npm token.\n\n---\n\nGood luck with your project \u2728\n\nYour **[semantic-release](https://github.com/semantic-release/semantic-release)** bot :package::rocket:\n\n<!-- semantic-release:github -->","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How i can access to Wilderness@Next?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"NavController docs reference goToRoot(), which does not exist","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Unexpected behavior when onlyFromAutocomplete=false with focusFirstElement=true","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Missing () In The Documentation Code\n\nIn documentation sample code for [`getRandomSubmission()`](https://not-an-aardvark.github.io/snoowrap/Subreddit.html) \r\nExpected :\r\n`r.getSubreddit('snoowrap').getRandomSubmission().then(console.log)`\r\nActual : \r\n`r.getSubreddit('snoowrap').getRandomSubmission.then(console.log)`\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"403 on all methods :-(","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Not clear how to pass data to component in tests","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Environment variables not properly passed in 8.5.16","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Which version to fork for new viz development?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Compiling drone","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Translate tasks/run-application/run-stateless-application-deployment in Korean\n\n**This is a Feature Request**\r\n\r\n**What would you like to be added**\r\n/docs/tasks/run-application/run-stateless-application-deployment.md Korean Translation\r\n\r\n**Why is this needed**\r\nNo Translate tasks/run-application/run-stateless-application-deployment in Korean\r\n\r\n**Comments**\r\nCopy from en content and Update content.\r\nhttps://kubernetes.io/docs/tasks/run-application/run-stateless-application-deployment/\r\n\r\n/language ko","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Consolidate actor documentation\n\nIt seems the advice found [here](https://github.com/apple/foundationdb/blob/master/flow/actorcompiler/Actor%20checklist.txt) might be more discoverable if it were moved to flow/README.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"authentication_failure_logging mapping","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# IDictionary indexer gives spurious CS8625 nullability warning\n\n**Version Used**: dotnet 3.0.100-preview7-012821\r\n\r\n**Steps to Reproduce**:\r\n\r\nCompile the following code with `<Nullable>enable</Nullable>`:\r\n\r\n```csharp\r\nIDictionary dictionary = new Dictionary<string, string>();\r\ndictionary[\"test\"] = null;\r\n```\r\n\r\nFull example at https://github.com/bgrainger/NullableTest/blob/master/IDictionaryIndexer.cs\r\n\r\n**Expected Behavior**:\r\n\r\nNo warnings.\r\n\r\nAs per https://docs.microsoft.com/en-us/dotnet/api/system.collections.idictionary?view=netframework-4.8:\r\n\r\n> The value can be null and does not have to be unique.\r\n\r\nAt https://github.com/dotnet/corefx/blob/b129f7657a1b93ce9cf577d769d9d03c862e2338/src/Common/src/CoreLib/System/Collections/IDictionary.cs#L18 the indexer property is denoted as `object?`.\r\n\r\n**Actual Behavior**:\r\n\r\nwarning CS8625: Cannot convert null literal to non-nullable reference type.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Link to repo","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Draw annotations programmatically.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Boost post","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[stdlib documentation] Only generate docs for public types","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Incorrect device parameters on SM-G930F","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Doxygen Comments for Test Case Functions","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Copy sample behat config file via Composer","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-11694 (High) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-11694 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nAn issue was discovered in LibSass through 3.5.4. 
A NULL pointer dereference was found in the function Sass::Functions::selector_append which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.\n\n<p>Publish Date: 2018-06-04\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694>CVE-2018-11694</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694</a></p>\n<p>Release Date: 2018-06-04</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Rationalize freeway alert level thresholds","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# #12 How many people actively contribute (code, documentation, other?) to the project at this time?\n\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Test failure in APITestSharded.testMultipleWaitsAndGets.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Installation","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Spring crash [103.0.1-1237-ge0f8a3d]","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update MCW with September workshop info","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update Garbage Collection Docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Add Terminology page\n\nPeople new to Event Sourcing, CQRS, Event-Driven Architecture etc. will benefit from quick overview of these terms. They can be given as pages in the overview section of the documentation area.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"[Request] Add videojs-contrib-quality-levels","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add instructions for how to build a set-up","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"rst/Sphinx documentation for *files* module","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Better documentations and how to use the app\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"When i navigate from one route to another child route internally, my sidebar component doesn't work","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# MAINTAINER & CONTRIBUTOR NEEDED\n\nHi guys!\r\n\r\nInstabot family needs **maintainers and contributors**. We need python experts to help our community of 5000+ users solve issues and fix bugs. I really want to expand our family and invite YOU to join our open-source organization.\r\n\r\nI've started this project 4 years ago as my student course project and now it becomes really **HUGE**. It is so hard to do this alone so I really appreciate any help with:\r\n\r\n1. Bug fixings\r\n2. Issue solving\r\n3. New Features creation\r\n4. Documentation enhancing \r\n\r\nYou don't need my permission to start contributing! Just create pull requests with your changes and I will check them. \r\n\r\nSee you!\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Enhancement: Local shell provisioning like Packer\n\nSome complex provisioning setups require custom commands to be executed on the host, such as telnet commands for legacy guest operating systems. Packer provides such local shell provisioning:\r\n\r\nhttp://packer.io/docs/provisioners/shell-local.html\r\n\r\nCan Vagrant receive a similar option?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Unsigned subtraction should be signed subtraction","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Salesforce connector","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"7.0.44 hangs on startup","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Add README for the render engine\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update Cron Jobs Docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"add instructions for standing up lab on Vagrant","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Update Documentation\n\nDocumentation needs to be updated and more detailed.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"blasr installation with CMake does not work","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Fixed height cards should clip partial content","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Getting Started with GitHub\n\n# :wave: Welcome to GitHub Learning Lab's \"Introduction to GitHub\"\n\nTo get started, I\u2019ll guide you through some important first steps in coding and collaborating on GitHub.\n\n:point_down: _This arrow means you can expand the window! Click on them throughout the course to find more information._\n<details><summary>What is GitHub?</summary>\n<hr>\n\n## What is GitHub?\n\nI'm glad you asked! Many people come to GitHub because they want to contribute to open source <sup>[:book:](https://help.github.com/articles/github-glossary/#open-source)</sup> projects, or they're invited by teammates or classmates who use it for their projects. Why do people use GitHub for these projects?\n\n**At its heart, GitHub is a collaboration platform.**\n\nFrom software to legal documents, you can count on GitHub to help you do your best work with the collaboration and security tools your team needs. With GitHub, you can keep projects completely private, invite the world to collaborate, and streamline every step of your project.\n\n**GitHub is also a powerful version control tool.**\n\nGitHub uses Git <sup>[:book:](https://help.github.com/articles/github-glossary/#git)</sup>, the most popular open source version control software, to track every contribution and contributor <sup>[:book:](https://help.github.com/articles/github-glossary/#contributor)</sup> to your project--so you know exactly where every line of code came from.\n\n**GitHub helps people do much more.**\n\nGitHub is used to build some of the most advanced technologies in the world. Whether you're visualizing data or building a new game, there's a whole community and set of tools on GitHub that can get you to the next step. This course starts with the basics, but we'll dig into the rest later!\n\n:tv: [Video: What is GitHub?](https://www.youtube.com/watch?v=w3jLJU7DT5E)\n<hr>\n</details><br>\n\n<details><summary>Exploring a GitHub repository</summary>\n<hr>\n\n## Exploring a GitHub repository\n\n:tv: [Video: Exploring a repository](https://www.youtube.com/watch?v=R8OAwrcMlRw)\n\n### More features\n\nThe video covered some of the most commonly-used features. Here are a few other items you can find in GitHub repositories:\n\n- Project boards: Create Kanban-style task tracking board within GitHub\n- Wiki: Create and store relevant project documentation\n- Insights: View a drop-down menu that contains links to analytics tools for your repository including:\n - Pulse: Find information about the work that has been completed and the work that\u2019s in-progress in this project dashboard\n - Graphs: Graphs provide a more granular view of the repository activity including who contributed to the repository, who forked it, and when they completed the work\n\n### Special Files\n\nIn the video you learned about a special file called the README.md. Here are a few other special files you can add to your repositories:\n\n- CONTRIBUTING.md: The `CONTRIBUTING.md` is used to describe the process for contributing to the repository. A link to the `CONTRIBUTING.md` file is shown anytime someone creates a new issue or pull request.\n- ISSUE_TEMPLATE.md: The `ISSUE_TEMPLATE.md` is another file you can use to pre-populate the body of an issue. 
For example, if you always need the same types of information for bug reports, include it in the issue template, and every new issue will be opened with your recommended starter text.\n\n<hr>\n</details>\n\n### Using issues\n\nThis is an issue <sup>[:book:](https://help.github.com/articles/github-glossary/#issue)</sup>: a place where you can have conversations about bugs in your code, code review, and just about anything else.\n\nIssue titles are like email subject lines. They tell your collaborators what the issue is about at a glance. For example, the title of this issue is Getting Started with GitHub.\n\n\n<details><summary>Using GitHub Issues</summary>\n\n## Using GitHub issues\n\nIssues are used to discuss ideas, enhancements, tasks, and bugs. They make collaboration easier by:\n\n- Providing everyone (even future team members) with the complete story in one place\n- Allowing you to cross-link to other issues and pull requests <sup>[:book:](https://help.github.com/articles/github-glossary/#pull-request)</sup>\n- Creating a single, comprehensive record of how and why you made certain decisions\n- Allowing you to easily pull the right people and teams into a conversation with @-mentions\n\n:tv: [Video: Using issues](https://www.youtube.com/watch?v=Zhj46r5D0nQ)\n\n<hr>\n</details>\n\n<details><summary>Managing notifications</summary>\n<hr>\n\n## Managing notifications\n\n:tv: [Video: Watching, notifications, stars, and explore](https://www.youtube.com/watch?v=ocQldxF7fMY)\n\nOnce you've commented on an issue or pull request, you'll start receiving email notifications when there's activity in the thread. \n\n### How to silence or unmute specific conversations\n\n1. Go to the issue or pull request\n2. Under _\"Notifications\"_, click the **Unsubscribe** button on the right to silence notifications or **Subscribe** to unmute them\n\nYou'll see a short description that explains your current notification status.\n\n### How to customize notifications in Settings\n\n1. Click your profile icon\n2. Click **Settings**\n3. Click **Notifications** from the menu on the left and [adjust your notification preferences](https://help.github.com/articles/managing-notification-delivery-methods/)\n\n### Repository notification options\n\n* **Watch**: You'll receive a notification when a new issue, pull request or comment is posted, and when an issue is closed or a pull request is merged \n* **Not watching**: You'll no longer receive notifications unless you're @-mentioned\n* **Ignore**: You'll no longer receive any notifications from the repository\n\n### How to review notifications for the repositories you're watching\n\n1. Click your profile icon\n2. Click **Settings**\n3. Click **Notification** from the menu on the left\n4. Click on the [repositories you\u2019re watching](https://github.com/watching) link\n5. Select the **Watching** tab\n6. Click the **Unwatch** button to disable notifications, or **Watch** to enable them\n\n<hr>\n</details>\n\n<hr>\n<h3 align=\"center\">Keep reading below to find your first task</h3>\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add me Please.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [docs]: DESIGN TOKENS - How to draw observeOn vs subscribeOn?\n\nHow to draw \r\n\r\nobserveOn vs subscribeOn?\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# building latest version\n\nHello,\r\n\r\nI am trying to build the latest github version on several machines, and I am getting this error below.\r\nWould you please give me hints what could be wrong ? I assume cmake is downloading and building dependencies \"behind the scenes\".\r\n\r\n~~~\r\[email protected]:/tmp/milias-work/software/qch/mrchem_suite/mrchem_master/../setup --cxx=icpc build_icpccmake -DCMAKE_CXX_COMPILER=icpc -DEXTRA_CXXFLAGS=\"''\" -DPYTHON_INTERPRETER=\"''\" -DENABLE_MPI=False -DENABLE_OPENMP=False -DENABLE_CODE_COVERAGE=False -DCMAKE_BUILD_TYPE=release -G\"Unix Makefiles\" -H/tmp/milias-work/software/qch/mrchem_suite/mrchem_master -Bbuild_icpc\r\n\r\n-- The CXX compiler identification is Intel 17.0.4.20170411\r\n-- Check for working CXX compiler: /cvmfs/it.gsi.de/compiler/intel/17.0/compilers_and_libraries_2017.4.196/linux/bin/intel64/icpc\r\n-- Check for working CXX compiler: /cvmfs/it.gsi.de/compiler/intel/17.0/compilers_and_libraries_2017.4.196/linux/bin/intel64/icpc -- works\r\n-- Detecting CXX compiler ABI info\r\n-- Detecting CXX compiler ABI info - done\r\n-- Detecting CXX compile features\r\n-- Detecting CXX compile features - done\r\n-- Found PythonInterp: /usr/bin/python (found version \"2.7.9\") \r\n-- Found Git: /usr/bin/git (found version \"2.1.4\") \r\n-- The C compiler identification is GNU 4.9.2\r\n-- Check for working C compiler: /usr/bin/cc\r\n-- Check for working C compiler: /usr/bin/cc -- works\r\n-- Detecting C compiler ABI info\r\n-- Detecting C compiler ABI info - done\r\n-- Detecting C compile features\r\n-- Detecting C compile features - done\r\n-- Suitable XCFun could not be located. Fetching and building!\r\n-- Setting option ENABLE_CODE_COVERAGE: False\r\n-- C++ compiler flags : -Wno-unknown-pragmas -g;-wd981;-wd279;-wd383;-wd1572;-wd177;-fno-rtti;-fno-exceptions \r\n-- C compiler flags : -ffloat-store;-m64 \r\n-- Setting (unspecified) option STATIC_LIBRARY_ONLY: OFF\r\n-- Setting (unspecified) option SHARED_LIBRARY_ONLY: OFF\r\n-- Setting (unspecified) option ENABLE_GENERIC: OFF\r\n-- Setting (unspecified) option XCFun_XC_MAX_ORDER: 3\r\n-- Setting (unspecified) option PYMOD_INSTALL_LIBDIR: python\r\n-- Setting (unspecified) option ENABLE_PYTHON_INTERFACE: OFF\r\n-- Performing Test COMPILER_HAS_HIDDEN_VISIBILITY\r\n-- Performing Test COMPILER_HAS_HIDDEN_VISIBILITY - Success\r\n-- Performing Test COMPILER_HAS_HIDDEN_INLINE_VISIBILITY\r\n-- Performing Test COMPILER_HAS_HIDDEN_INLINE_VISIBILITY - Success\r\n-- Performing Test COMPILER_HAS_DEPRECATED_ATTR\r\n-- Performing Test COMPILER_HAS_DEPRECATED_ATTR - Success\r\n-- Setting (unspecified) option ENABLE_TESTALL: ON\r\n-- Suitable Eigen3 could not be located. 
Fetching and building!\r\n-- Performing Test EIGEN_COMPILER_SUPPORT_CPP11\r\n-- Performing Test EIGEN_COMPILER_SUPPORT_CPP11 - Success\r\n-- Performing Test COMPILER_SUPPORT_std=cpp03\r\n-- Performing Test COMPILER_SUPPORT_std=cpp03 - Failed\r\n-- Performing Test standard_math_library_linked_to_automatically\r\n-- Performing Test standard_math_library_linked_to_automatically - Success\r\n-- Standard libraries to link to explicitly: none\r\n-- Performing Test COMPILER_SUPPORT_WERROR\r\n-- Performing Test COMPILER_SUPPORT_WERROR - Success\r\n-- Performing Test COMPILER_SUPPORT_pedantic\r\n-- Performing Test COMPILER_SUPPORT_pedantic - Success\r\n-- Performing Test COMPILER_SUPPORT_Wall\r\n-- Performing Test COMPILER_SUPPORT_Wall - Success\r\n-- Performing Test COMPILER_SUPPORT_Wextra\r\n-- Performing Test COMPILER_SUPPORT_Wextra - Success\r\n-- Performing Test COMPILER_SUPPORT_Wundef\r\n-- Performing Test COMPILER_SUPPORT_Wundef - Success\r\n-- Performing Test COMPILER_SUPPORT_Wcastalign\r\n-- Performing Test COMPILER_SUPPORT_Wcastalign - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wcharsubscripts\r\n-- Performing Test COMPILER_SUPPORT_Wcharsubscripts - Success\r\n-- Performing Test COMPILER_SUPPORT_Wnonvirtualdtor\r\n-- Performing Test COMPILER_SUPPORT_Wnonvirtualdtor - Success\r\n-- Performing Test COMPILER_SUPPORT_Wunusedlocaltypedefs\r\n-- Performing Test COMPILER_SUPPORT_Wunusedlocaltypedefs - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wpointerarith\r\n-- Performing Test COMPILER_SUPPORT_Wpointerarith - Success\r\n-- Performing Test COMPILER_SUPPORT_Wwritestrings\r\n-- Performing Test COMPILER_SUPPORT_Wwritestrings - Success\r\n-- Performing Test COMPILER_SUPPORT_Wformatsecurity\r\n-- Performing Test COMPILER_SUPPORT_Wformatsecurity - Success\r\n-- Performing Test COMPILER_SUPPORT_Wshorten64to32\r\n-- Performing Test COMPILER_SUPPORT_Wshorten64to32 - Success\r\n-- Performing Test COMPILER_SUPPORT_Wlogicalop\r\n-- Performing Test COMPILER_SUPPORT_Wlogicalop - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wenumconversion\r\n-- Performing Test COMPILER_SUPPORT_Wenumconversion - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wcpp11extensions\r\n-- Performing Test COMPILER_SUPPORT_Wcpp11extensions - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wdoublepromotion\r\n-- Performing Test COMPILER_SUPPORT_Wdoublepromotion - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wshadow\r\n-- Performing Test COMPILER_SUPPORT_Wshadow - Success\r\n-- Performing Test COMPILER_SUPPORT_Wnopsabi\r\n-- Performing Test COMPILER_SUPPORT_Wnopsabi - Failed\r\n-- Performing Test COMPILER_SUPPORT_Wnovariadicmacros\r\n-- Performing Test COMPILER_SUPPORT_Wnovariadicmacros - Success\r\n-- Performing Test COMPILER_SUPPORT_Wnolonglong\r\n-- Performing Test COMPILER_SUPPORT_Wnolonglong - Success\r\n-- Performing Test COMPILER_SUPPORT_fnochecknew\r\n-- Performing Test COMPILER_SUPPORT_fnochecknew - Success\r\n-- Performing Test COMPILER_SUPPORT_fnocommon\r\n-- Performing Test COMPILER_SUPPORT_fnocommon - Success\r\n-- Performing Test COMPILER_SUPPORT_fstrictaliasing\r\n-- Performing Test COMPILER_SUPPORT_fstrictaliasing - Success\r\n-- Performing Test COMPILER_SUPPORT_wd981\r\n-- Performing Test COMPILER_SUPPORT_wd981 - Success\r\n-- Performing Test COMPILER_SUPPORT_wd2304\r\n-- Performing Test COMPILER_SUPPORT_wd2304 - Success\r\n-- Performing Test COMPILER_SUPPORT_STRICTANSI\r\n-- Performing Test COMPILER_SUPPORT_STRICTANSI - Success\r\n-- Performing Test COMPILER_SUPPORT_Qunusedarguments\r\n-- Performing Test 
COMPILER_SUPPORT_Qunusedarguments - Failed\r\n-- Performing Test COMPILER_SUPPORT_OPENMP\r\n-- Performing Test COMPILER_SUPPORT_OPENMP - Failed\r\n-- Looking for Q_WS_X11\r\n-- Looking for Q_WS_X11 - found\r\n-- Looking for Q_WS_WIN\r\n-- Looking for Q_WS_WIN - not found\r\n-- Looking for Q_WS_QWS\r\n-- Looking for Q_WS_QWS - not found\r\n-- Looking for Q_WS_MAC\r\n-- Looking for Q_WS_MAC - not found\r\n-- Found Qt4: /usr/bin/qmake-qt4 (found version \"4.8.6\") \r\n-- The Fortran compiler identification is GNU 4.9.2\r\n-- Check for working Fortran compiler: /usr/bin/gfortran\r\n-- Check for working Fortran compiler: /usr/bin/gfortran -- works\r\n-- Detecting Fortran compiler ABI info\r\n-- Detecting Fortran compiler ABI info - done\r\n-- Checking whether /usr/bin/gfortran supports Fortran 90\r\n-- Checking whether /usr/bin/gfortran supports Fortran 90 -- yes\r\n-- Found Qt4: /usr/bin/qmake-qt4 (found version \"4.8.6\") \r\n-- Found OpenGL: /usr/lib/x86_64-linux-gnu/libGL.so \r\n-- Found CHOLMOD: /usr/include/suitesparse \r\n-- Found UMFPACK: /usr/include/suitesparse \r\n-- Could NOT find SUPERLU (missing: SUPERLU_INCLUDES SUPERLU_LIBRARIES SUPERLU_VERSION_OK) \r\n-- A version of Pastix has been found but pastix_nompi.h does not exist in the include directory. Because Eigen tests require a version without MPI, we disable the Pastix backend.\r\n-- \r\n-- Configured Eigen 3.3.7\r\n-- \r\n-- Some things you can do now:\r\n-- --------------+--------------------------------------------------------------\r\n-- Command | Description\r\n-- --------------+--------------------------------------------------------------\r\n-- make install | Install Eigen. Headers will be installed to:\r\n-- | <CMAKE_INSTALL_PREFIX>/<INCLUDE_INSTALL_DIR>\r\n-- | Using the following values:\r\n-- | CMAKE_INSTALL_PREFIX: /usr/local\r\n-- | INCLUDE_INSTALL_DIR: include/eigen3\r\n-- | Change the install location of Eigen headers using:\r\n-- | cmake . -DCMAKE_INSTALL_PREFIX=yourprefix\r\n-- | Or:\r\n-- | cmake . -DINCLUDE_INSTALL_DIR=yourdir\r\n-- make doc | Generate the API documentation, requires Doxygen & LaTeX\r\n-- make check | Build and run the unit-tests. Read this page:\r\n-- | http://eigen.tuxfamily.org/index.php?title=Tests\r\n-- make blas | Build BLAS library (not the same thing as Eigen)\r\n-- make uninstall| Removes files installed by make install\r\n-- --------------+--------------------------------------------------------------\r\n-- \r\n-- Suitable MRCPP could not be located. 
Fetching and building!\r\n-- Setting (unspecified) option BUILD_STATIC_LIBS: OFF\r\n-- Using Eigen3: /lustre/nyx/ukt/milias/work/software/mrchem_suite/mrchem (version 3.3.7)\r\n-- Found packages:\r\n * PythonInterp\r\n * Qt4\r\n * OpenGL\r\n * Cholmod\r\n * Umfpack\r\n * Git\r\n\r\n-- Suitable nlohmann_json could not be located: downloading and building nlohmann_json instead.\r\n-- Using the single-header code from /tmp/milias-work/software/qch/mrchem_suite/mrchem_master/build_icpc/nlohmann_json_sources-src/single_include/\r\n-- Found packages:\r\n * PythonInterp\r\n * Qt4\r\n * OpenGL\r\n * Cholmod\r\n * Umfpack\r\n * Git\r\n\r\n-- Configuring incomplete, errors occurred!\r\nSee also \"/tmp/milias-work/software/qch/mrchem_suite/mrchem_master/build_icpc/CMakeFiles/CMakeOutput.log\".\r\nSee also \"/tmp/milias-work/software/qch/mrchem_suite/mrchem_master/build_icpc/CMakeFiles/CMakeError.log\".\r\n\r\nCMake Error at /lustre/nyx/ukt/milias/work/software/mrchem_suite/mrchem/build_openmpi_intel17mkl/_deps/eigen3_sources-build/Eigen3Config.cmake:20 (include):\r\n include could not find load file:\r\n\r\n /lustre/nyx/ukt/milias/work/software/mrchem_suite/mrchem/build_openmpi_intel17mkl/_deps/eigen3_sources-build/Eigen3Targets.cmake\r\nCall Stack (most recent call first):\r\n external/upstream/fetch_eigen3.cmake:1 (find_package)\r\n cmake/custom/main.cmake:10 (include)\r\n CMakeLists.txt:64 (include)\r\n\r\n\r\nCMake Error at /lustre/nyx/ukt/milias/work/software/mrchem_suite/mrchem/build_openmpi_intel17mkl/_deps/eigen3_sources-build/Eigen3Config.cmake:20 (include):\r\n include could not find load file:\r\n\r\n /lustre/nyx/ukt/milias/work/software/mrchem_suite/mrchem/build_openmpi_intel17mkl/_deps/eigen3_sources-build/Eigen3Targets.cmake\r\nCall Stack (most recent call first):\r\n build_icpc/mrcpp_sources-src/external/upstream/fetch_eigen3.cmake:3 (find_package)\r\n build_icpc/mrcpp_sources-src/cmake/custom/main.cmake:15 (include)\r\n build_icpc/mrcpp_sources-src/CMakeLists.txt:64 (include)\r\n~~~","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"routerLinkActive applies active class to null routerLink","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Doc suggestion: clarify version and degree of YAML spec compliance\n\nI think it would help if the README or some other piece of top-level documentation provided some guidance regarding what version or versions of the YAML spec this library aims to support, and how fully it supports that spec.\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add some docs about issue submission and expected process","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Unable to find \"crypto-json\" (\"dt\") in the registry","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Complete Documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"moderator docs and malformed requests","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Merge Documentation to Wiki","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Import cheatsheet","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Community guidelines","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Test overriding","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# More documentation\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add docs about /host feature","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Improve ffmpeg configuration instructions, or automatize the process\n\nTrying to run python get_mp3.py musics.txt in a fresh Linux PopOs! installation, get the above error message. The solution it's to install and configure a ffmpeg codec to the system. \r\nI think it's a good aproach to check if it can be an automatized process for every enviroment and S.O. at the first get_mp3 run. If cannot implement this, or became much complicated, it needs to improve the readme informations to install corrrectly.\r\n\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"admin-transactional-locking link in settings webinterface","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# BL-Touch triggering too soon sometimes\n\nHey,\r\nI'm very happy with using klipper and it's working pretty reliable.\r\nI only have a problem with my TL BL-Touch Clone.\r\nwhen I ```bed_mesh_calibrate``` I'll get the right z-distance about 70% of the time.\r\nSometimes my probe just triggers to soon without touching the bed surface.\r\na bed probe looks like this most of the time:\r\n```\r\nRecv: // probe at 50.000,50.000 is z=0.615000\r\nRecv: // probe at 50.000,50.000 is z=0.610000\r\nRecv: // probe at 50.000,50.000 is z=0.607500\r\nRecv: // probe at 111.662,50.000 is z=0.525000\r\nRecv: // probe at 111.662,50.000 is z=0.535000\r\nRecv: // probe at 111.662,50.000 is z=0.537500\r\nRecv: // probe at 173.325,50.000 is z=0.467500\r\nRecv: // probe at 173.325,50.000 is z=0.495000\r\nRecv: // probe at 173.325,50.000 is z=0.495000\r\nRecv: // probe at 234.975,50.000 is z=4.237500 -- Triggered without hitting the bed surface\r\nRecv: // probe at 234.975,50.000 is z=0.502500\r\nRecv: // probe at 234.975,50.000 is z=0.520000\r\nRecv: // probe at 234.975,108.325 is z=0.475000\r\nRecv: // probe at 234.975,108.325 is z=7.717500 -- Triggered without hitting the bed surface\r\nRecv: // probe at 234.975,108.325 is z=0.442500\r\nRecv: // probe at 173.325,108.325 is z=0.502500\r\nRecv: // probe at 173.325,108.325 is z=0.507500\r\nRecv: // probe at 173.325,108.325 is z=0.505000\r\nRecv: // probe at 111.662,108.325 is z=0.582500\r\nRecv: // probe at 111.662,108.325 is z=0.580000\r\nRecv: // probe at 111.662,108.325 is z=0.582500\r\nRecv: // probe at 50.000,108.325 is z=0.655000\r\nRecv: // probe at 50.000,108.325 is z=0.647500\r\nRecv: // probe at 50.000,108.325 is z=0.645000\r\nRecv: // probe at 50.000,166.662 is z=0.705000\r\nRecv: // probe at 50.000,166.662 is z=0.707500\r\nRecv: // probe at 50.000,166.662 is z=0.702500\r\nRecv: // probe at 111.662,166.662 is z=0.595000\r\nRecv: // probe at 111.662,166.662 is z=0.600000\r\nRecv: // probe at 111.662,166.662 is z=7.845000 -- Triggered without hitting the bed surface\r\nRecv: // probe at 173.325,166.662 is z=0.492500\r\nRecv: // probe at 173.325,166.662 is z=0.507500\r\nRecv: // probe at 173.325,166.662 is z=0.507500\r\nRecv: // probe at 234.975,166.662 is z=0.445000\r\nRecv: // probe at 234.975,166.662 is z=0.450000\r\nRecv: // probe at 234.975,166.662 is z=0.445000\r\nRecv: // probe at 234.975,224.987 is z=0.485000\r\nRecv: // probe at 234.975,224.987 is z=0.487500\r\nRecv: // probe at 234.975,224.987 is z=0.485000\r\nRecv: // probe at 173.325,224.987 is z=0.575000\r\nRecv: // probe at 173.325,224.987 is z=0.570000\r\nRecv: // probe at 173.325,224.987 is z=0.567500\r\nRecv: // probe at 111.662,224.987 is z=0.670000\r\nRecv: // probe at 111.662,224.987 is z=7.910000 -- Triggered without hitting the bed surface\r\nRecv: // probe at 111.662,224.987 is z=0.650000\r\nRecv: // probe at 50.000,224.987 is z=0.765000 \r\nRecv: // probe at 50.000,224.987 is z=0.767500 \r\n```\r\nI already played around with those bltouch section properties:\r\n```\r\n#pin_up_reports_not_triggered: True\r\n# Set if the BLTouch consistently reports the probe in a \"not\r\n# triggered\" state after a successful \"pin_up\" command. This should\r\n# be True for a genuine BLTouch; some BLTouch clones may require\r\n# False. The default is True.\r\n#pin_up_touch_mode_reports_triggered: True\r\n# Set if the BLTouch consistently reports a \"triggered\" state after\r\n# the commands \"pin_up\" followed by \"touch_mode\". 
This should be\r\n# True for a genuine BLTouch v2 and earlier; the BLTouch v3 and some\r\n# BLTouch clones require False. The default is True.\r\n```\r\nprinter.cfg:\r\n[Neues Textdokument.txt](https://github.com/KevinOConnor/klipper/files/3489847/Neues.Textdokument.txt)\r\nklippy.log:\r\n[klippy.log](https://github.com/KevinOConnor/klipper/files/3489856/klippy.log)\r\n\r\nDoes someone had similar issues and a solution to this?\r\nAre there some other properties in the config I can play around with?\r\nThanks a lot in advantage","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Call WebSocket service for Long-Running in .net core?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"API Rate limit error while downloading `oc` causes Minishift to not start","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Features","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Typo in Security administration page","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Resolver parameter order documented incorrectly","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"GitBook","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Feedback: 'OWIN HTTP Message Pass Through'","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Max length on Github Import\n\n# \ud83d\udc1b bug report\r\n\r\n## Description of the problem\r\n\r\nFor some reason creating a sandbox from my github repo is failing.\r\n\r\n### Steps to Recreate\r\n\r\n1. Go to [Import from Github](https://codesandbox.io/s/github)\r\n1. Paste in this url: `https://github.com/final-form/react-final-form/tree/master/examples/record-level-validation`\r\n1. Receive error: `should be at most 64 character(s)`\r\n\r\n\r\n\r\nMy url is a bit long, but this seems a little ridiculous considering the example given in the docs is: `https://github.com/reduxjs/redux/tree/master/examples/todomvc`, which is 61 characters long.\r\n\r\nAlso `https://github.com/final-form/react-final-form/tree/master/examples/subscriptions` works just fine, and it is 81 characters long, so I don't think it's the length of the Github url that is the problem.\r\n\r\nSo the question is \"_WHAT_ should be at most 64 character(s)?\"\r\n\r\n## How has this issue affected you? What are you trying to accomplish?\r\n\r\nI'm trying to keep the examples for my open source library in Github and not CodeSandbox so that people can make PRs to them.\r\n\r\n### Your Environment\r\n\r\n| Software | Name/Version |\r\n| ---------------- | ------------ |\r\n| Browser | Chrome and Brave |\r\n| Operating System | macOS Mojave |\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update README","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Where can I get the Dataset? \n\nHey, @ash-aldujaili @hembergerik @alhuang10 I have submitted the google docs form three times, but I still haven't received the dataset. I was wondering if there is something else I should do in order to qualify. \r\n\r\nAlso regarding the dataset, are the contents of the datasets actual binaries for the malwares? If so, do I then run the `generate_vectors.py` script to extract the features? ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"asset directory?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Add SW part in README.md\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Ubuntu 17.04 install - missing dep + app not working","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Build environment","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"'SSM' object has no attribute 'get_parameter'","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Document global types and API wrappers\n\nThe API wrappers and global types would benefit from documentation, as it would be helpful when IDE's are doing type-ahead assistance.\r\n\r\nWe should be able to grab the documentation in the integrators guide and add it to the relevant functions.\r\n\r\nhttps://backstage.forgerock.com/docs/idm/6.5/integrators-guide/#appendix-scripting","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Linux + Sublime + Xdebug = Not Stopping at Breakpoints","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Please add \"user_id\" support to CreateOrUpdateSubscriber","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"ReadMe add","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"macOS install requires files in `IHaskell/.stack-work`?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# add link to syntax location API\n\nI needed a way to access the location information for a syntax object. Unfortunately, the place where the concept is introduced \r\n\r\nhttps://docs.racket-lang.org/reference/syntax-model.html?q=syntax%20objects#%28tech._syntax._object%29\r\n\r\nhas links to all the parts _except_ the phrase \"source-location information\". Through a convoluted search I got what I needed from\r\n\r\nhttps://docs.racket-lang.org/reference/stxops.html\r\n\r\nSuggestions:\r\n\r\n1. Have the location phrase also be a link.\r\n\r\n2. Add a link to the above docs (which provide the elimination forms) associated with the introduction of the concept.\r\n\r\n3. On the stxops page, have the literal word \"location\" (which is used consistently for this concept) appear with the libraries that actually provide (components of) that information.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Node samples should add or modify examples to match the tutorials","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"run errors ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Calling reindex on record subset replaces entire index","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"congestion","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"MacOS Sierra 10.12.6 inc_vendor.cl file not found","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Refactor readme to an table of content","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Next steps\n\n## Nice work\n\n\n\nCongratulations @anesta95, you've completed this course!\n\n### What went well\n\nBefore I say good-bye, here's a recap of all the tasks you've accomplished in your repository:\n\n- You learned why merge conflicts happen\n- You resolved a simple merge conflict\n- You resolved a multi-file merge conflict \n- You created a merge conflict, and resolved it!\n\n### What's next?\n\nHere are some instructions you can use to keep working on your resum\u00e9:\n\n<details>\n <summary>Finishing the resume</summary>\n <hr>\n \n #### Finishing the resume\n \n To update the other sections of the resume, create a new branch and edit the files found in the `_data` folder.\n\n For example, to modify the \"Projects\" section, edit the `_data/projects.yml` file. After making your changes, create a new pull request and merge your changes.\n \n <hr>\n</details>\n\n<details>\n <summary>Changing the picture</summary>\n <hr>\n \n #### Changing the picture\n \n If you would like to change the image used on your resume, you need to make a few changes to the files.\n\n 1. Create a new branch, maybe name it something like `new-avatar`.\n 1. Navigate to the `images` directory and click the **Upload files** button.\n 1. [Drag and drop your image](https://help.github.com/articles/adding-a-file-to-a-repository/).\n 1. Commit your change by clicking **Commit changes**.\n 1. On the `new-avatar` branch, open the `_layouts/resume.html` file and edit line 16. Replace `images/bob-avatar.jpg` with `images/YOURFILENAME`.\n 1. Create a pull request.\n 1. Merge the pull request, and delete the branch.\n \n <hr>\n</details>\n\n<details>\n <summary>Enabling GitHub Pages</summary>\n <hr>\n \n #### Enabling GitHub Pages\n \n When you are happy with your resume, you will need to publish it with GitHub Pages. This resume is ready for GitHub Pages, you just need to turn it on. Follow these steps to enable GitHub Pages when you are ready to publish your resume:\n\n 1. Click on the [**Settings**](https://github.com/anesta95/merge-conflicts/settings) tab.\n 1. Scroll to the \"GitHub Pages\" section.\n 1. In the \"Source\" drop-down, select **master branch**.\n 1. Click **Save**.\n 1. :construction: Warning! :construction: Make sure you don't see any [errors after you select save](https://user-images.githubusercontent.com/13326548/36769372-bc9b43d4-1bf8-11e8-8050-2b08cf8d146b.png). If you do, your page won't build correctly and this step will be incomplete.\n \nYour GitHub Pages resum\u00e9 site will be live very shortly. Check it out here:\n\n\n https://anesta95.github.io/merge-conflicts\n\n \n <hr>\n</details>\n\n### Keep Learning\n\nWant to work on resolving merge conflicts using the command line? Check out [this documentation](https://help.github.com/articles/resolving-a-merge-conflict-using-the-command-line/).\n\nWant to keep learning? Feel free to [check out our other courses](https://lab.github.com/courses)?\n\n<hr>\n<h3 align=\"center\">I won't respond to this issue, go ahead and close it when finished!</h3>\n\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# IAM user creation for AK/SK user not possible\n\nHi there,\r\n\r\nI want to create n users with AK/SK pairs. For example technical users.\r\n\r\nI want to create n users with a password. For new colleagues.\r\n\r\nIn the UI I can choose if a user should be a AK/SK OR a password user.\r\n\r\nUnfortunalty I cant choose between these options using terraform.\r\nThis might be a limitation within the API since I couldnt find any hint for it in the API-Docs.\r\nSo I opened a parallel support request\r\n\r\n### Terraform Version\r\n```bash\r\n$ terraform --version\r\nTerraform v0.11.14\r\n```\r\n\r\n### Affected Resource(s)\r\n- opentelekomcloud_identity_user_v3\r\n\r\n### Terraform Configuration Files\r\n```hcl\r\nresource \"opentelekomcloud_identity_user_v3\" \"user\" {\r\n name = \"${var.username}\"\r\n password = \"${var.password}\"\r\n default_project_id = \"${var.default_project_id}\"\r\n domain_id = \"${var.domain_id}\"\r\n enabled = \"${var.enabled}\"\r\n}\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# PROBLEM WITH A COMPILATION MARLIN 2.0.X ON RE-ARM\n\nI adquired a re-arm board and i have a issues with the compilation, (I use the ATOM software), y follow a tutorials.\r\nI search on the web about of this problem and on this issues, i cant foun nothing.\r\nSorry for my english.\r\nWhen i go to create the PIO Building, the program show me this erros, I search on the directory , i see all are normal,\r\nThis are the problems with the pins (is possible is clear the showme , but i cant found):\r\n \r\n\r\nVerbose mode can be enabled via `-v, --verbose` option\r\nCONFIGURATION: https://docs.platformio.org/page/boards/nxplpc-arduino-lpc176x/nxp_lpc1768.html\r\nPLATFORM: NXP Arduino LPC176x 0.0.2 > NXP LPC1768\r\nHARDWARE: LPC1768 100MHz, 31.80KB RAM, 464KB Flash\r\nDEBUG: Current (cmsis-dap) On-board (cmsis-dap) External (blackmagic, jlink)\r\nPACKAGES: framework-arduino-lpc176x 0.1.3, toolchain-gccarmnoneeabi 1.70201.0 (7.2.1)\r\nConverting Marlin.ino\r\n\r\nLDF: Library Dependency Finder -> http://bit.ly/configure-pio-ldf\r\nLDF Modes: Finder ~ off, Compatibility ~ strict\r\nFound 9 compatible libraries\r\nScanning dependencies...\r\nDependency Graph\r\n|-- <Servo> 1.0.0\r\n|-- <LiquidCrystal> 1.0.0\r\n|-- <U8glib-HAL> 0.4\r\n|-- <TMCStepper> 0.4.6\r\n|-- <Adafruit NeoPixel> 1.2.4\r\n|-- <SailfishLCD>\r\n \r\nUnable to find destination disk (Autodetect Error)\r\nPlease select it in platformio.ini using the upload_port keyword (https://docs.platformio.org/en/latest/projectconf/section_env_upl\r\noad.html) or copy the firmware (.pio/build/LPC1768/firmware.bin) manually to the appropriate disk\r\n\r\n\r\ncompiling .pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\DebugMonitor_LPC1768.cpp.o\r\nCompiling .pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL.cpp.o\r\nCompiling .pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL_spi.cpp.o\r\nCompiling .pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL_timers.cpp.o\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/MarlinConfig.h:32:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\../../core/serial.h:24,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\DebugMonitor_LPC1768.cpp:26:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/../pins/pins.h:537:4: error: #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:32:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL.cpp:25:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/../pins/pins.h:537:4: error: #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/MarlinConfig.h:38:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\../../core/serial.h:24,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\DebugMonitor_LPC1768.cpp:26:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/SanityCheck.h:1457:4: error: #error \"HEATER_0_PIN not defined for this board.\"\r\n\r\n #error \"HEATER_0_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/SanityCheck.h:1599:8: error: #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not def\r\nined for this board.\"\r\n #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../core/../inc/SanityCheck.h:1603:10: error: #error \"E1_STEP_PIN, E1_DIR_PIN, or 
E1_ENABLE_PIN not de\r\nfined for this board.\"\r\n #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:38:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL.cpp:25:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1457:4: error: #error \"HEATER_0_PIN not defined for this board.\"\r\n #error \"HEATER_0_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1599:8: error: #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for\r\n this board.\"\r\n #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1603:10: error: #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined fo\r\nr this board.\"\r\n #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:32:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL_spi.cpp:51:\r\n\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/../pins/pins.h:537:4: error: #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:32:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL_timers.cpp:31:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/../pins/pins.h:537:4: error: #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n #error \"Unknown MOTHERBOARD value set in Configuration.h\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:38:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL_spi.cpp:51:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1457:4: error: #error \"HEATER_0_PIN not defined for this board.\"\r\n #error \"HEATER_0_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1599:8: error: #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for\r\n this board.\"\r\n #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1603:10: error: #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined fo\r\nr this board.\"\r\n #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nIn file included from Marlin\\src\\HAL\\HAL_LPC1768\\../../inc/MarlinConfig.h:38:0,\r\n from Marlin\\src\\HAL\\HAL_LPC1768\\HAL_timers.cpp:31:\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1457:4: error: #error \"HEATER_0_PIN not defined for this board.\"\r\n #error \"HEATER_0_PIN not defined for this board.\"\r\n\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1599:8: error: #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for\r\n this board.\"\r\n #error \"E0_STEP_PIN, E0_DIR_PIN, or E0_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\nMarlin\\src\\HAL\\HAL_LPC1768\\../../inc/SanityCheck.h:1603:10: error: #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined fo\r\nr this board.\"\r\n #error \"E1_STEP_PIN, E1_DIR_PIN, or E1_ENABLE_PIN not defined for this board.\"\r\n ^~~~~\r\n*** [.pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\DebugMonitor_LPC1768.cpp.o] Error 1\r\n*** [.pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL_spi.cpp.o] Error 
1\r\n*** [.pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL.cpp.o] Error 1\r\n*** [.pio\\build\\LPC1768\\src\\src\\HAL\\HAL_LPC1768\\HAL_timers.cpp.o] Error 1\r\n========================== [ERROR] Took 5.20 seconds ==========================\r\n \r\n================================== [SUMMARY] ==================================\r\nEnvironment megaatmega2560 [IGNORED]\r\nEnvironment megaatmega1280 [IGNORED]\r\nEnvironment at90usb1286_cdc [IGNORED]\r\nEnvironment at90usb1286_dfu [IGNORED]\r\nEnvironment DUE [IGNORED]\r\nEnvironment DUE_USB [IGNORED]\r\nEnvironment DUE_debug [IGNORED]\r\nEnvironment LPC1769 [IGNORED]\r\nEnvironment melzi [IGNORED]\r\nEnvironment melzi_optiboot [IGNORED]\r\nEnvironment rambo [IGNORED]\r\nEnvironment sanguino_atmega644p [IGNORED]\r\nEnvironment sanguino_atmega1284p [IGNORED]\r\n\r\nEnvironment STM32F1 [IGNORED]\r\nEnvironment fysetc_STM32F1 [IGNORED]\r\nEnvironment BIGTREE_SKR_MINI [IGNORED]\r\nEnvironment STM32F4 [IGNORED]\r\nEnvironment STM32F7 [IGNORED]\r\nEnvironment ARMED [IGNORED]\r\nEnvironment alfawise_U20 [IGNORED]\r\nEnvironment mks_robin [IGNORED]\r\nEnvironment mks_robin_lite [IGNORED]\r\nEnvironment mks_robin_mini [IGNORED]\r\nEnvironment mks_robin_nano [IGNORED]\r\nEnvironment jgaurora_a5s_a1 [IGNORED]\r\nEnvironment black_stm32f407ve [IGNORED]\r\nEnvironment BIGTREE_SKR_PRO [IGNORED]\r\nEnvironment teensy31 [IGNORED]\r\nEnvironment teensy35 [IGNORED]\r\nEnvironment malyanm200 [IGNORED]\r\nEnvironment esp32 [IGNORED]\r\nEnvironment fysetc_f6_13 [IGNORED]\r\nEnvironment linux_native [IGNORED]\r\nEnvironment adafruit_grandcentral_m4 [IGNORED]\r\nEnvironment LPC1768 [FAILED]\r\n==================== 1 failed, 0 succeeded in 5.21 seconds ====================\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Implement all Stripe errors","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Developer Portal Documentation typo\n\n\r\nHere,\r\n\r\nthis \"of of\". \r\n\r\n\r\nYou all are amazing, peace!","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# ALTER DATABASE ... WITH NO_WAIT\n\nI am learning more about ALTER DATABASE [AdventureWorks2017] SET RECOVERY options. When I open the database properties dialog box, go to options page, change the Recovery model from Simple to Full, click the Script action to new query window, SQL Server 2017 SSMS v18.2, creates a script: \n ALTER DATABASE [AdventureWorks2017] SET RECOVERY WITH NO_WAIT\n\nI didn't see \"WITH NO_WAIT\" in the ALTER DATABASE documentation. It would be nice if it was there.\n\nBest regards,\nJody \n\n---\n#### Document Details\n\n\u26a0 *Do not edit this section. It is required for docs.microsoft.com \u279f GitHub issue linking.*\n\n* ID: 8d76ab6e-5a3d-94eb-26f9-b2805b35ea85\n* Version Independent ID: 2ac43d27-c318-3e32-1e78-55229975eb4e\n* Content: [ALTER DATABASE SET Options (Transact-SQL) - SQL Server](https://docs.microsoft.com/en-us/sql/t-sql/statements/alter-database-transact-sql-set-options?view=sql-server-2017#feedback)\n* Content Source: [docs/t-sql/statements/alter-database-transact-sql-set-options.md](https://github.com/MicrosoftDocs/sql-docs/blob/live/docs/t-sql/statements/alter-database-transact-sql-set-options.md)\n* Product: **sql**\n* Technology: **t-sql**\n* GitHub Login: @CarlRabeler\n* Microsoft Alias: **carlrab**","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Write simple installation instructions for pandas\n\npandas has a quite complete documentation page with installation instructions:\r\n\r\nhttps://pandas.pydata.org/pandas-docs/stable/install.html\r\n\r\nBut personally, I don't want users that come to pandas for the first time (and possibly to the whole PyData ecosystem) to find this document and have to use it to get started. I think it makes much more sense to provide a short document with how we think they should set everything up, and then a link to the existing document if they are interested in `Advanced installation instructions`.\r\n\r\nWhat I think we should provide is a document that explains step by step:\r\n- How to get Anaconda\r\n- How to set up an environment with pandas\r\n- How to import and possibly show the pandas version\r\n- Link to the tutorials (we are working on them) and to the advanced installation instructions if they want more options\r\n\r\nI'm unsure on whether the instructions should show how to get the environment directly in JupyterLab, show first in a Python terminal and then in JupyterLab, or ignore JupyterLab and just show with the Python terminal.\r\n\r\nI think it may make sense to have screenshots and make the installation instructions as easy and visual as possible.\r\n\r\nFor now they should go into our `/doc/` directory, and when we're happy with them, we'll open the PR in pandas.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Eliminate reduce() ?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# could you add more function for this theme?\n\n<!-- use this template only to report a bug or ask a question -->\r\n<!-- fill this part for bugs reporting or questions -->\r\n### Configuration\r\n\r\n - **Operating system with its version**: Win10\r\n - **Browser with its version**:chrome 76\r\n - **Hugo version**: <!-- You can get version by typing: hugo version -->v0.56.3-F637A1EA\r\n - **Tranquilpeak version**: <!-- You can find version on package.json -->0.4.6-BETA\r\n - **Do you reproduce on https://tranquilpeak.kakawait.com demo?**:\r\n \r\n<!-- fill this part for bugs reporting if needed -->\r\n### Actual behavior\r\n\r\n<!-- fill this part for bugs reporting if needed -->\r\n### Expected behavior\r\nI'm a China user.thank you. this theme is awesome. In China disqus_thread is not work ,but **utteranc** can work good. could you add this comment tool support for this theme?\r\n\r\nanother question: Article does not support counting\uff0ccould you add this?\r\nI find another theme,support above function,you can refer to it.website:https://github.com/rujews/maupassant-hugo/blob/master/README_EN.md\r\n\r\nthank you again.\r\n<!-- fill this part for bugs reporting if needed -->\r\n### Steps to reproduce the behavior","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2019-1010266 (Medium) detected in lodash-1.0.2.tgz\n\n## CVE-2019-1010266 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-1.0.2.tgz</b></p></summary>\n\n<p>A utility library delivering consistency, customization, performance, and extras.</p>\n<p>Library home page: <a href=\"https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz\">https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz</a></p>\n<p>Path to dependency file: /website/docs/package.json</p>\n<p>Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json</p>\n<p>\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nlodash prior to 4.17.11 is affected by: CWE-400: Uncontrolled Resource Consumption. The impact is: Denial of service. The component is: Date handler. The attack vector is: Attacker provides very long strings, which the library attempts to match using a regular expression. The fixed version is: 4.17.11.\n\n<p>Publish Date: 2019-07-17\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266>CVE-2019-1010266</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: Low\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-1010266</a></p>\n<p>Release Date: 2019-07-17</p>\n<p>Fix Resolution: 4.17.11</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Mail stored in Mailgun, but not appearing in Odoo","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"UnsupportedAlgorithm: Backend object does not implement ScryptBackend","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Optional API parameter bug\n\n**Bug Description**\r\n\r\nIf all parameters of an API are `optional`, calling the API with no parameter will crash the node. Fortunately we don't have such API in production.\r\n\r\nA sample API: https://github.com/bitshares/bitshares-core/blob/c5e8585576237e4e87e9bc9b8c90d4e45f351502/libraries/app/include/graphene/app/database_api.hpp#L407-L409\r\n\r\nStack back trace:\r\n\r\n>#0 0x00000000010ee5b0 in fc::variant::get_type() const ()\r\n#1 0x00000000010ee799 in fc::variant::is_null() const ()\r\n#2 0x0000000000b9451b in std::vector<graphene::app::general_asset_info, std::allocator<graphene::app::general_asset_info> > fc::generic_api::call_generic<std::vector<graphene::app::general_asset_info, std::allocator<graphene::app::general_asset_info> >, fc::optional<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, fc::optional<unsigned int>, fc::optional<graphene::app::general_asset_info::asset_type> >(std::function<std::vector<graphene::app::general_asset_info, std::allocator<graphene::app::general_asset_info> > (fc::optional<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, fc::optional<unsigned int>, fc::optional<graphene::app::general_asset_info::asset_type>)> const&, __gnu_cxx::__normal_iterator<fc::variant const*, std::vector<fc::variant, std::allocator<fc::variant> > >, __gnu_cxx::__normal_iterator<fc::variant const*, std::vector<fc::variant, std::allocator<fc::variant> > >, unsigned int) ()\r\n#3 0x0000000000b955aa in std::_Function_handler<fc::variant (std::vector<fc::variant, std::allocator<fc::variant> > const&), std::function<fc::variant (std::vector<fc::variant, std::allocator<fc::variant> > const&)> fc::generic_api::api_visitor::to_generic<std::vector<graphene::app::general_asset_info, std::allocator<graphene::app::general_asset_info> >, fc::optional<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, fc::optional<unsigned int>, fc::optional<graphene::app::general_asset_info::asset_type> >(std::function<std::vector<graphene::app::general_asset_info, std::allocator<graphene::app::general_asset_info> > (fc::optional<std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > >, fc::optional<unsigned int>, fc::optional<graphene::app::general_asset_info::asset_type>)> const&) const::{lambda(std::vector<fc::variant, std::allocator<fc::variant> > const&)#1}>::_M_invoke(std::_Any_data const&, std::vector<fc::variant, std::allocator<fc::variant> > const&) ()\r\n#4 0x000000000116f541 in fc::generic_api::call(unsigned int, std::vector<fc::variant, std::allocator<fc::variant> > const&) ()\r\n#5 0x000000000116f8ac in fc::generic_api::call(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<fc::variant, std::allocator<fc::variant> > const&) ()\r\n#6 0x000000000116fb7a in fc::api_connection::receive_call(unsigned int, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<fc::variant, std::allocator<fc::variant> > const&) const ()\r\n#7 0x0000000001169f2c in std::_Function_handler<fc::variant (std::vector<fc::variant, std::allocator<fc::variant> > const&), fc::rpc::websocket_api_connection::websocket_api_connection(std::shared_ptr<fc::http::websocket_connection> const&, unsigned int)::{lambda(std::vector<fc::variant, std::allocator<fc::variant> > const&)#1}>::_M_invoke(std::_Any_data const&, std::vector<fc::variant, std::allocator<fc::variant> > 
const&) ()\r\n#8 0x00000000012261d4 in fc::rpc::state::local_call(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::vector<fc::variant, std::allocator<fc::variant> > const&) ()\r\n#9 0x000000000116b636 in fc::rpc::websocket_api_connection::on_request(fc::variant const&) ()\r\n#10 0x000000000116d1d4 in fc::rpc::websocket_api_connection::on_message(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) ()\r\n#11 0x000000000116ddea in std::_Function_handler<fc::http::reply (std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&), fc::rpc::websocket_api_connection::websocket_api_connection(std::shared_ptr<fc::http::websocket_connection> const&, unsigned int)::{lambda(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)#6}>::_M_invoke(std::_Any_data const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&) ()\r\n#12 0x0000000001215d13 in fc::http::detail::websocket_server_impl::websocket_server_impl()::{lambda(std::weak_ptr<void>)#4}::operator()(std::weak_ptr<void>) const::{lambda()#1}::operator()() const::{lambda()#1}::operator()() const ()\r\n#13 0x0000000001216439 in fc::detail::void_functor_run<fc::http::detail::websocket_server_impl::websocket_server_impl()::{lambda(std::weak_ptr<void>)#4}::operator()(std::weak_ptr<void>) const::{lambda()#1}::operator()() const::{lambda()#1}>::run(void*, {lambda()#1}) ()\r\n#14 0x0000000001113eb4 in fc::task_base::run_impl() ()\r\n#15 0x000000000111236f in fc::thread_d::process_tasks() ()\r\n#16 0x0000000001112b1c in fc::thread_d::start_process_tasks(long) ()\r\n#17 0x0000000001853e61 in make_fcontext ()\r\n#18 0x0000000000000000 in ?? ()\r\n\r\n\r\n**Impacts**\r\nDescribe which portion(s) of BitShares Core may be impacted by this bug. Please tick at least one box.\r\n- [x] API (the application programming interface)\r\n- [ ] Build (the build process or something prior to compiled code)\r\n- [ ] CLI (the command line wallet)\r\n- [ ] Deployment (the deployment process after building such as Docker, Travis, etc.)\r\n- [ ] DEX (the Decentralized EXchange, market engine, etc.)\r\n- [ ] P2P (the peer-to-peer network for transaction/block propagation)\r\n- [ ] Performance (system or user efficiency, etc.)\r\n- [ ] Protocol (the blockchain logic, consensus, validation, etc.)\r\n- [ ] Security (the security of system or user data, etc.)\r\n- [ ] UX (the User Experience)\r\n- [ ] Other (please add below)\r\n\r\n**Steps To Reproduce**\r\nSteps to reproduce the behavior (example outlined below):\r\n1. Execute API call '...'\r\n2. Using JSON payload '...'\r\n3. Received response '...'\r\n4. See error in screenshot\r\n\r\n**Expected Behavior**\r\nA clear and concise description of what you expected to happen.\r\n\r\n**Screenshots (optional)**\r\nIf applicable, add screenshots to help explain process flow and behavior.\r\n\r\n**Host Environment**\r\nPlease provide details about the host environment. Much of this information can be found running: `witness_node --version`. \r\n - Host OS: [e.g. Ubuntu 18.04 LTS]\r\n - Host Physical RAM [e.g. 4GB]\r\n - BitShares Version: [e.g. 2.0.180425]\r\n - OpenSSL Version: [e.g. 1.1.0g]\r\n - Boost Version: [e.g. 
1.65.1]\r\n \r\n**Additional Context (optional)**\r\nAdd any other context about the problem here.\r\n\r\n## CORE TEAM TASK LIST\r\n- [ ] Evaluate / Prioritize Bug Report\r\n- [ ] Refine User Stories / Requirements\r\n- [ ] Define Test Cases\r\n- [ ] Design / Develop Solution\r\n- [ ] Perform QA/Testing\r\n- [ ] Update Documentation\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Move HW details from README.md to Wiki\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Some questions about gitmint.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Managing Neo field on the frontend","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Need Documentation of functions which can be called using python SDK","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update the Java/Android library to parallel the .NET version","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Clarify coding principles","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Login, promptToInstall: false, opens Google Play Store for downloading LinkedIn app on Android","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Improve documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Slow omnicompletion for object members","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-19797 (Medium) detected in CSS::Sass-v3.6.0\n\n## CVE-2018-19797 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>CSS::Sassv3.6.0</b></p></summary>\n<p>\n\n<p>Library home page: <a href=https://metacpan.org/pod/CSS::Sass>https://metacpan.org/pod/CSS::Sass</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (63)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nIn LibSass 3.5.5, a NULL Pointer Dereference in the function Sass::Selector_List::populate_extends in SharedPtr.hpp (used by ast.cpp and ast_selectors.cpp) may cause a Denial of Service (application crash) via a crafted sass input file.\n\n<p>Publish Date: 2018-12-03\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19797>CVE-2018-19797</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Write README.md for asc-ui\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"[PRE REVIEW]: Pipengine","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Run with docker\n\nlink to my previous issue #1 , to run with docker suggestion from domoticz can be used [link](https://github.com/fchauveau/rpi-domoticz-docker#pro-tips)\r\nCould you please update readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"-d option do not support an absolute path","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"MSB1003 when *nix system has a /m file or directory","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Compiling from source: no such module 'SourceKittenFramework'","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# broken links in docs\n\nI was reading docs here https://docs.rs/conrod/0.61.1/conrod/guide/chapter_1/index.html and noticed there are broken links in the document. They all link to examples. I'll list all the ones I find.\r\n\r\n| (source) Line Number | Text |\r\n|------|------|\r\n|26|`[all_widgets.rs example](https://github.com/PistonDevelopers/conrod/blob/master/examples/all_widgets.rs).`|\r\n|34|`**The [all_piston_window.rs example](https://github.com/PistonDevelopers/conrod/blob/master/examples/all_piston_window.rs).**`|\r\n|38|`**The [canvas.rs example](https://github.com/PistonDevelopers/conrod/blob/master/examples/canvas.rs).**`|\r\n\r\nIt seems like these example projects all still exist. `all_piston_window.rs` for example seems to have been moved to\r\nhttps://github.com/PistonDevelopers/conrod/blob/master/backends/conrod_piston/examples/all_piston_window.rs\r\n\r\nIt seems like a similar thing has happened before (https://github.com/PistonDevelopers/conrod/issues/1257) I wonder if there's some CI thing that could easily be done to make sure issues like this are brought to light immediately.\r\n\r\nI would be happy to submit a PR.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Upgrade to pac4j v2.x","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# ERROR: update firefly to latest Docker\n\nHello,\r\n\r\ni got error after firelfy latest update:\r\n\r\n> [Sun Aug 11 09:28:36.648728 2019] [php7:error] [pid 277] [client 172.17.0.1:40156] PHP Fatal error: Uncaught RuntimeException: No application encryption key has been specified. in /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php:44\\nStack trace:\\n#0 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Support/helpers.php(1124): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->Illuminate\\\\Encryption\\\\{closure}(NULL)\\n#1 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php(48): tap(NULL, Object(Closure))\\n#2 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php(24): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->key(Array)\\n#3 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Container/Container.php(787): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->Illuminate\\\\Encryption\\\\{closure}(Object(Illuminate\\\\Foundation\\\\Application), Array)\\n#4 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Container/Container.php(667): Illuminate\\\\Container\\\\C in /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php on line 44, referer: https://192.168.1.111/\r\n[Sun Aug 11 09:28:36.649240 2019] [php7:error] [pid 277] [client 172.17.0.1:40156] PHP Fatal error: Uncaught RuntimeException: No application encryption key has been specified. in /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php:44\\nStack trace:\\n#0 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Support/helpers.php(1124): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->Illuminate\\\\Encryption\\\\{closure}(NULL)\\n#1 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php(48): tap(NULL, Object(Closure))\\n#2 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php(24): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->key(Array)\\n#3 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Container/Container.php(787): Illuminate\\\\Encryption\\\\EncryptionServiceProvider->Illuminate\\\\Encryption\\\\{closure}(Object(Illuminate\\\\Foundation\\\\Application), Array)\\n#4 /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Container/Container.php(667): Illuminate\\\\Container\\\\C in /var/www/firefly-iii/vendor/laravel/framework/src/Illuminate/Encryption/EncryptionServiceProvider.php on line 44, referer: https://192.168.1.111/\r\n172.17.0.1 - - [11/Aug/2019:09:28:36 +0300] \"GET / HTTP/1.1\" 500 211 \"https://192.168.1.111/\" \"Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/76.0.3809.100 Safari/537.36\"\r\n\r\nDocker version 19.03.1\r\n\"Ubuntu 18.04.3 LTS\"\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [Bug] docker-compose.yml\n\nDocs for `docker-compose.yml` show this:\r\n\r\n````\r\nimage: fanningert/docker-taiga\r\n````\r\n\r\nWhich should be this:\r\n````\r\nimage: m0wer/docker-taiga\r\n````\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2016-10540 (High) detected in minimatch-0.2.14.tgz, minimatch-2.0.10.tgz\n\n## CVE-2016-10540 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>minimatch-0.2.14.tgz</b>, <b>minimatch-2.0.10.tgz</b></p></summary>\n<p>\n\n<details><summary><b>minimatch-0.2.14.tgz</b></p></summary>\n\n<p>a glob matcher in javascript</p>\n<p>Library home page: <a href=\"https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz\">https://registry.npmjs.org/minimatch/-/minimatch-0.2.14.tgz</a></p>\n<p>Path to dependency file: /website/docs/package.json</p>\n<p>Path to vulnerable library: /tmp/git/website/docs/node_modules/globule/node_modules/minimatch/package.json</p>\n<p>\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **minimatch-0.2.14.tgz** (Vulnerable Library)\n</details>\n<details><summary><b>minimatch-2.0.10.tgz</b></p></summary>\n\n<p>a glob matcher in javascript</p>\n<p>Library home page: <a href=\"https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz\">https://registry.npmjs.org/minimatch/-/minimatch-2.0.10.tgz</a></p>\n<p>Path to dependency file: /website/docs/package.json</p>\n<p>Path to vulnerable library: /tmp/git/website/docs/node_modules/minimatch/package.json</p>\n<p>\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-stream-3.1.18.tgz\n - :x: **minimatch-2.0.10.tgz** (Vulnerable Library)\n</details>\n\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nMinimatch is a minimal matching utility that works by converting glob expressions into JavaScript `RegExp` objects. 
The primary function, `minimatch(path, pattern)` in Minimatch 3.0.1 and earlier is vulnerable to ReDoS in the `pattern` parameter.\n\n<p>Publish Date: 2018-05-31\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10540>CVE-2016-10540</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://nodesecurity.io/advisories/118\">https://nodesecurity.io/advisories/118</a></p>\n<p>Release Date: 2016-06-20</p>\n<p>Fix Resolution: Update to version 3.0.2 or later.</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Wrong semantic for immutable @ConfigurationProperties contributed via @Import\n\nHello,\r\n\r\nAs of SB 2.2.0.M5, the changelog states that `@ConfigurationProperties-annotated types are no longer scanned in slice tests unless imported explicitly. This restores the behaviour that slice tests should only scan what is described in the documentation.`\r\n\r\nWhen an immutable `@ConfigurationProperties` is imported in a test slice, and the constructor parameters are not beans (ex : strings), the test fail.\r\n\r\nAs it's not that easy to describe, I have done a repository that reproduces the issue : https://github.com/mpalourdio/demorepro . Please run the [single test](https://github.com/mpalourdio/demorepro/blob/c26895768629855309e963b205498fb009d38293/src/test/java/com/example/demo/MyControllerTest.java) to see it fail.\r\n\r\nIn order to make it work with SB 2.0.0.M4, remove the `@Import` in the test class, and change the SB version in `pom.xml` to target SB 2.0.0.M4. The test will succeed.\r\n\r\nThanks by advance, and let me know if you need more information.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Material not working in new CLI application","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"hydro-devel Installation instructions are incomplete - missing dependencies","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"/config/firstboot file should be removed","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Feature Request: Stabilize \"Special Types\" Message Pack Type Codes\n\nCurrently, the [msgpack_rpc API documentation](https://neovim.io/doc/user/msgpack_rpc.html#rpc-types) says this about the message pack type codes: \r\n\r\n> Even for statically compiled clients it is good practice to avoid hardcoding\r\nthe type codes, because a client may be built against one Nvim version but\r\nconnect to another with different type codes.\r\n\r\nWhile it is certainly possible to create a statically compiled client which defers knowledge the type-code information until run time, the instability makes the process considerably harder. In particular, the deserialization or serialization of any types that reference buffers, tabpages, or windows must require an open connection to a neovim client. \r\n\r\nIt would be a better experience for those writing static clients for these constants to be fixed, and considered a part of the stable API.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"CNI doesnt work on Windows","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Dropdown: filter: Keypress up/down does not work","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"onChange function does not return/provide parameter with the caller proxy context","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Inconsistent behavior when trying to delete a system message\n\nIf you try to use the single message delete endpoint (https://discordapp.com/developers/docs/resources/channel#delete-message) you get an error which is ``403 FORBIDDEN (error code: 50021): Cannot execute action on a system message``\r\n\r\nBut if you use bulk delete messages (https://discordapp.com/developers/docs/resources/channel#bulk-delete-messages)\r\n\r\nThe request is successful and the message gets deleted, I reproduced this with the news feed follow feature: https://i.imgur.com/D6tKpIc.png\r\n\r\nI'm not sure if you're supposed to be able to delete the message at all or not.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# License and author information\n\nhttps://github.com/Warsow/warsow-assets/blob/master/README.md states:\r\n\r\n> Most assets are provided under CC-BY-SA 4.0 and CC-BY-ND 4.0.\r\n\r\nHow can I tell which asset is licensed under which license and who the author is?\r\n\r\nFor example:\r\nhttps://github.com/Warsow/warsow-assets/blob/master/maps/wrace1.bsp\r\n\r\nIf there is no system yet, it's possible to have one text file per file, one text file per folder, one text file in root per license (as https://github.com/Warsow/warsow-assets/blob/master/assets-non-free.txt seems to do it) or one text file in root for all.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"fix readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# open-mpi (ffmpeg dependency) not installing from source on Linux Mint 19.2\n\n**Please note we will close your issue without comment if you delete, do not read or do not fill out the issue checklist below and provide ALL the requested information. If you repeatedly fail to use the issue template, we will block you from ever submitting issues to Homebrew again.**\r\n\r\n- [X] are reporting a bug others will be able to reproduce and not asking a question or requesting software. If you're not sure or want to ask a question do so on our Discourse: https://discourse.brew.sh. To get software added or changed in Homebrew please file a [Pull Request](https://github.com/Homebrew/linuxbrew-core/blob/master/CONTRIBUTING.md)\r\n- [X] have a problem with `brew install` (or `upgrade`, `reinstall`) a single, Homebrew/homebrew-core formula (not cask) on macOS? If it's a general `brew` problem please file this issue at Homebrew/brew: https://github.com/Homebrew/brew/issues/new/choose. If it's a Linux problem please file this issue at https://github.com/Homebrew/linuxbrew-core/issues/new/choose. If it's a `brew cask` problem please file this issue at https://github.com/Homebrew/homebrew-cask/issues/new/choose. If it's a tap (e.g. Homebrew/homebrew-php) problem please file this issue at the tap.\r\n- [X] ran `brew update` and can still reproduce the problem?\r\n- [X] ran `brew doctor`, fixed all issues and can still reproduce the problem?\r\n- [X] ran `brew gist-logs open-mpi` (where `open-mpi` is the name of the formula that failed) and included the output link?\r\n- [ ] if `brew gist-logs` didn't work: ran `brew config` and `brew doctor` and included their output with your issue?\r\n\r\nTo help us debug your issue please explain:\r\n\r\n- What you were trying to do (and why)\r\n Try to install `open-mpi` since it\u2019s a dependency for `ffmpeg` which in turn is a dependency for my `loudgain`.\r\n\r\n- What happened (include command output)\r\n```\r\n==> Downloading https://download.open-mpi.org/release/open-mpi/v4.0/openmpi-4.0.1.tar.bz2\r\nAlready downloaded: /home/matthias/.cache/Homebrew/downloads/06441b124d088edf06929dec0c12aa39897d6b5353863404590fec0fa833abcc--openmpi-4.0.1.tar.bz2\r\n==> ./configure --prefix=/home/linuxbrew/.linuxbrew/Cellar/open-mpi/4.0.1_2 --disable-silent-rules --enable-ipv6 --with-libevent=/home/linuxbrew/.linuxbrew/opt/libevent --with-sge --enable-mpi1-compatibility\r\n==> make all\r\n==> make check\r\nLast 15 lines from /home/matthias/.cache/Homebrew/Logs/open-mpi/03.make:\r\n============================================================================\r\nMakefile:2154: recipe for target 'test-suite.log' failed\r\nmake[4]: *** [test-suite.log] Error 1\r\nmake[4]: Leaving directory '/tmp/open-mpi-20190811-7210-w7afb0/openmpi-4.0.1/test/util'\r\nMakefile:2260: recipe for target 'check-TESTS' failed\r\nmake[3]: *** [check-TESTS] Error 2\r\nmake[3]: Leaving directory '/tmp/open-mpi-20190811-7210-w7afb0/openmpi-4.0.1/test/util'\r\nMakefile:2347: recipe for target 'check-am' failed\r\nmake[2]: *** [check-am] Error 2\r\nmake[2]: Leaving directory '/tmp/open-mpi-20190811-7210-w7afb0/openmpi-4.0.1/test/util'\r\nMakefile:1756: recipe for target 'check-recursive' failed\r\nmake[1]: *** [check-recursive] Error 1\r\nmake[1]: Leaving directory '/tmp/open-mpi-20190811-7210-w7afb0/openmpi-4.0.1/test'\r\nMakefile:1893: recipe for target 'check-recursive' failed\r\nmake: *** [check-recursive] Error 1\r\n\r\nREAD THIS: https://docs.brew.sh/Troubleshooting\r\n\r\nThese open issues may 
also help:\r\nopen-mpi: Build a bottle for Linuxbrew https://github.com/Homebrew/linuxbrew-core/pull/14719\r\n```\r\n\r\n- What you expected to happen\r\n open-mpi installed cleanly (actually, ffmpeg installed but as requested, tried only _one_ formula).\r\n\r\n- Step-by-step reproduction instructions (by running `brew install` commands)\r\n ```bash\r\n brew install open-mpi\r\n brew install open-mpi --env=std\r\n ```\r\n (Both didn\u2019t work.)\r\n\r\nGist as requested: https://gist.github.com/64675b0cb84cea491d066fd7f91f4e24\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Exe file missing","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-11695 (High) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-11695 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nAn issue was discovered in LibSass through 3.5.2. 
A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.\n\n<p>Publish Date: 2018-06-04\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695>CVE-2018-11695</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: High\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11695</a></p>\n<p>Release Date: 2018-06-04</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Cli default output regression","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Server unable to load function with custom command\n\n**CommandAPI version**\r\nCommandAPI 2.2 on Paper-166 (1.14.4)\r\nalso tried it with a Spigot version from [GetBukkit.org](https://getbukkit.org/download/spigot)\r\n\r\n**Describe the bug**\r\nThe server is unable load a mcfunction file with a custom command.\r\nThe command it self works as intended.\r\n\r\n**My code**\r\n(Copied from [documentation](https://jorelali.github.io/1.13-Command-API/functions.html))\r\n```java\r\n@Override\r\n public void onLoad() {\r\n //Commands which will be used in Minecraft functions are registered here\r\n\r\n CommandAPI.getInstance().register(\"killall\", new LinkedHashMap<>(), (sender, args) -> {\r\n //Kills all enemies in all worlds\r\n Bukkit.getWorlds()\r\n .forEach(w -> w.getLivingEntities()\r\n .forEach(e -> e.setHealth(0))\r\n );\r\n });\r\n }\r\n```\r\n\r\n**Error**\r\n\r\n```\r\n[23:32:31 ERROR]: Couldn't load function at test:functions/with.mcfunction\r\njava.util.concurrent.CompletionException: java.lang.IllegalArgumentException: Whilst parsing command on line 1: Unknown command at position 0: <--[HERE]\r\n\tat java.util.concurrent.CompletableFuture.encodeThrowable(CompletableFuture.java:273) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.CompletableFuture.completeThrowable(CompletableFuture.java:280) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:604) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:577) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:443) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692) ~[?:1.8.0_191]\r\n\tat java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:157) ~[?:1.8.0_191]\r\nCaused by: java.lang.IllegalArgumentException: Whilst parsing command on line 1: Unknown command at position 0: <--[HERE]\r\n\tat net.minecraft.server.v1_14_R1.CustomFunction.a(CustomFunction.java:77) ~[patched_1.14.4.jar:git-Paper-166]\r\n\tat net.minecraft.server.v1_14_R1.CustomFunctionData.lambda$a$2(CustomFunctionData.java:168) ~[patched_1.14.4.jar:git-Paper-166]\r\n\tat java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:602) ~[?:1.8.0_191]\r\n\t... 6 more\r\n```\r\n\r\nLog-file:\r\n[latest.log](https://github.com/JorelAli/1.13-Command-API/files/3489311/latest.log)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Status Code documentation\n\nSee: https://github.com/ambergkim/nouri-api/issues/38","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Use eslint specified in `eslint.nodePath` rather than global, for untitled files\n\nFor untitled files, why does it forcefully tries to use the global eslint, rather than the one specified in `eslint.nodePath`?\r\n\r\nInstead, for untitled files, it should still continue to use the eslint specified in `nodePath`, or whatever it uses by default for workspaces.\r\n\r\nI assume there's a technical limitation for this, and if so, then please mention it in the readme, and please suggest a workaround, if one exists. The reason this is a problem for me, is that since upgrading to eslint 6, my plugins cannot be accessed globally, so they are installed in the same dir as my global eslint config file, and now this vscode extension is trying to access the global plugins for untitled files, when I don't have any there anymore.\r\n\r\nI'm also getting `Error: Failed to load config \"airbnb\" to extend from.` when opening JS untitled files, and I've installed `eslint-config-airbnb` locally, globally, in the same project as my global eslint config, etc. but nothing fixes it.\r\n\r\nPer these lines: \r\n\r\nhttps://github.com/microsoft/vscode-eslint/blob/9318950b070c2642fd76f5852ca12e6bee0884aa/client/src/extension.ts#L692-L698","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add an Instructions block to the top of all new projects","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Server Crash - Charcoal docs\n\nWas playing Dungeons, Dragons, and Spaceships. I clicked on the link to open the charcoal entry in the ingame manual from the JEI info tab and the server crashed. The same error occurs when I go into the AA manual and click on the \"Coal Stuff\" link.\r\n\r\nLink to full crash log is https://paste.dimdev.org/qenedahona.mccrash\r\n\r\n\r\njava.lang.NullPointerException\r\n at java.util.regex.Matcher.getTextLength(Matcher.java:1283)\r\n at java.util.regex.Matcher.reset(Matcher.java:309)\r\n at java.util.regex.Matcher.<init>(Matcher.java:229)\r\n at java.util.regex.Pattern.matcher(Pattern.java:1093)\r\n at java.util.Formatter.parse(Formatter.java:2547)\r\n at java.util.Formatter.format(Formatter.java:2501)\r\n at java.util.Formatter.format(Formatter.java:2455)\r\n at java.lang.String.format(String.java:2928)\r\n at net.minecraft.client.resources.Locale.formatMessage(Locale.java:128)\r\n at net.minecraft.client.resources.I18n.format(SourceFile:15)\r\n at de.ellpeck.actuallyadditions.mod.util.StringUtil.localize(StringUtil.java:37)\r\n at de.ellpeck.actuallyadditions.mod.booklet.page.PageCrafting.drawScreenPre(PageCrafting.java:72)\r\n at de.ellpeck.actuallyadditions.mod.booklet.gui.GuiPage.drawScreenPre(GuiPage.java:179)\r\n at de.ellpeck.actuallyadditions.mod.booklet.gui.GuiBooklet.drawScreen(GuiBooklet.java:166)\r\n at net.minecraftforge.client.ForgeHooksClient.drawScreen(ForgeHooksClient.java:396)\r\n at net.minecraft.client.renderer.EntityRenderer.updateCameraAndRender(EntityRenderer.java:1124)\r\n at net.minecraft.client.Minecraft.runGameLoop(Minecraft.java:1119)\r\n at net.minecraft.client.Minecraft.run(Minecraft.java:3942)\r\n at net.minecraft.client.main.Main.main(SourceFile:123)\r\n at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\r\n at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\r\n at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\r\n at java.lang.reflect.Method.invoke(Method.java:497)\r\n at net.minecraft.launchwrapper.Launch.launch(Launch.java:135)\r\n at net.minecraft.launchwrapper.Launch.main(Launch.java:28)\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Design Review 2017-08-16","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Kestrel HTTPS instructions are incomplete (preview 2)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Visibility of the examples is poor","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update Docs for Container Lifecycle Hooks","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Team discussion of next steps","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Alumni dashboard","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Roadmap\n\nHey I saw this on Reddit and wanted to help out. I think the next thing needed is a roadmap with goals and objectives laid out for the project, that way people can help out. \r\n\r\nThings I can think of:\r\n\r\nA list of core widgets to implement\r\nA prioritized list of rendering backends\r\nImprovements and new features for the layout algorithm\r\n\r\nI think some additional documentation laying out the main philosophy and ideas behind the project would be good too. \r\n\r\nI don't really know how much time I will have to work on this but I'd love to help whenever I can, a library like this could be really useful.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Can't compile ocaml-ctypes (missing Dl implementation)\n\nI ran the following:\r\n\r\n`opam install ctypes-foreign`\r\n\r\nmy dune file looks like this:\r\n\r\n```\r\n(library\r\n (name name_goes_here)\r\n (libraries base core_kernel ctypes async)\r\n (inline_tests)\r\n (preprocess\r\n (pps ppx_jane)))\r\n```\r\n\r\nAnd the error message that I get is the following:\r\n\r\n```\r\nNo implementations provided for the following modules:\r\n Dl referenced from name_goes_here.cmxa\r\n```\r\n\r\nI'm building on ubuntu, so I don't understand why it didn't pull in [dl.ml.unix](https://github.com/ocamllabs/ocaml-ctypes/blob/e7063afab825982a3fb9b937ca1c5384ec26f4e4/src/ctypes-foreign-base/dl.ml.unix)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add Users as a supported entity","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How Do You Start?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"kvSelectColumn is not defined","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"nRF SDK 11 link is wrong","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Updated build instructions for 16.04.1-Ubuntu missing","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Optimiser interface is not documented\n\nThe optimiser api is not documented anywhere. Here are some questions that came up for me:\r\n\r\n- the relationship between `apply!` and `update!` is not clear just based on the names of these methods. It appears that `apply!` updates gradients, while `update!` updates parameters.\r\n- it is not immediately obvious what `apply!` should mutate. Maybe just the optimiser? This is exacerbated by the fact that it returns `\u0394`. Is there a reason to return `\u0394` when it is already being mutated? Some light documentation would quickly dispel any confusion.\r\n- There is no documentation. I looked at the pull request for the optimiser changes as well as the commit message and the code itself. \r\n- Parameters are named `x`. This is not what I would expect, since x usually refers to the features. This could quickly be cleared up with some docstrings. \r\n- It is not immediately clear why parameters are being passed into `apply!`. It seems that we are not allowed to mutate `x`, and it also isn't used by any of the optimisers to compute gradient updates. It would help to document the the lifecycle of the `apply!` method to understand that we need to attach optimiser state to each individual parameter. The reason is that `apply!` can get called many times in the same update step, once for each parameter. Some documentation would be helpful. \r\n- The docstrings are not in sync with the coder, or not appropriate for the methods being described. Example: \"Classic gradient descent optimiser with learning rate `\u03b7`. For each parameter `p` and its gradient `\u03b4p`, this runs `p -= \u03b7*\u03b4p`.\". Names of the parameters don't match the code here, which uses `x` and `\u0394` instead of `p` and `\u03b4p`. Also, `apply!` does not update parameters. This is done somewhere else (in the `update!`).\r\n\r\nI think it might be helpful to explicity write out an interface along with some documentation. Then each Optimiser will need less documentation. Ignoring the fact that `Optimiser` already exists, something like this maybe:\r\n\r\n```julia:\r\n\"\"\"\r\nApply changes to gradients. \r\n\r\nOptimisers are called once for each parameter in each update step. Because of\r\nthis, the parameter `x` is also passed to each `apply!` call. Optimisers can\r\nthen do bookkeeping by namespacing changes to that parameter only, e.g. by\r\nkeeping an internal dictionary of parameters to internal counters.\r\n\"\"\"\r\nabstract type Optimiser; end\r\n\r\n\"\"\"\r\nApplies changes to the gradient.\r\n\r\n# Arguments\r\n- `o`: the optimiser. Keep optimiser state in here and update on apply! as appropriate.\r\n- `x`: the parameter being optimised. It may not be changed by apply.\r\n- `\u0394`: the gradients w.r.t the parameter. Gradients should be mutated by apply using elementwise operations.\r\n\r\n# Returns\r\n- `\u0394`: the updated gradients w.r.t the parameter.\r\n\"\"\"\r\napply!(o::Optimiser, x, \u0394) = error(\"not implemented\")\r\n\r\n\"\"\"\r\n Descent(\u03b7)\r\n\r\nClassic gradient descent optimiser with learning rate `\u03b7`. When applied, the\r\ngiven gradient `\u0394` is mutated to be equal to `\u03b7\u0394`.\r\n\"\"\"\r\nstruct Descent <: Optimiser\r\n eta::Float64\r\nend\r\n\r\nDescent() = Descent(0.1)\r\napply!(o::Descent, x, \u0394) = \u0394 .*= o.eta\r\n```","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Best practice to add a board/CPU?\n\nHello,\r\n\r\nI would like to see if there's additional documentation that I can follow to add a \"customized\" board/CPU. \r\n\r\nAt this point, I'm specifically interested in WiFi interface, where the underlying WiFi component can be either on-chip or onboard. (WiFi driver is likely vended to us in binary, and data sheet is likely incomplete..) With Renode, the hope is to have as little modification in kernel/OS code as possible, so that the emulation result resembles result running on a real piece of hardware. (In a slightly different approach, I guess we could \"mock\" WiFi in kernel/OS whenever running on emulator... ) Appreciate any tip!\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"add video to ReadMe.md","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Quest - update API docs to show Octane and Native Classes\n\nIt's almost time to show Octane-style code samples in the API docs! Right now, our examples use Ember Objects, and we want them to use native classes and Component file co-location instead. Can you help? Here's how! If you have any questions, drop by the `#dev-ember-learning` channel on [Ember Discord](https://api.emberjs.com/ember/release). Keep reading for step by step instructions, a styleguide, and some examples of the kinds of changes we need help with.\r\n\r\n## Step by step instructions\r\n\r\n1. Comment on this issue to say which item you are volunteering for. The checklist of things we need help with is in the comment below. Only volunteer for items that you think you can open a PR for in a week or less.\r\n2. Before you start working, double check to make sure it's not already been PR'd or claimed by someone else.\r\n3. Fork this repo and branch from `master`. If it's your first open source PR ever, check out [these instructions](https://medium.com/@jenweber/your-first-open-source-contribution-a-step-by-step-technical-guide-d3aca55cc5a6) to learn how to fork, branch, etc. or make changes via the GitHub website instead.\r\n4. Locate the file that has the API doc. The easiest way to do this is locate the file on <https://api.emberjs.com>, and click the edit pencil for an API entry or follow the \"defined in\" link. Our docs are written in YUIdoc and they live right next to the JavaScript for the API.\r\n5. Make your changes! EmberObject is still supported, so we need to show both styles. [Here's an example](https://api.emberjs.com/ember/3.12/functions/@ember%2Fservice/inject). In general, you should give Octane examples first, and EmberObject second. In cases where multiple examples are needed, make all of them Octane, and then add just 1 for EmberObject.\r\nIf you don't know Octane syntax yet, don't worry. There's a [Cheatsheet](https://ember-learn.github.io/ember-octane-vs-classic-cheat-sheet/) and [guide to what is different](https://octane-guides-preview.emberjs.com/release/upgrading/editions/)\r\n6. When you make a commit, it should start with `[DOC beta]`. For example `[DOC beta] update component example for Octane`.\r\n7. Open a pull request. Please check the box that says \"allow maintainers to make edits.\"\r\n8. Celebrate and brag about it to all your friends\r\n\r\n## Styleguide\r\n\r\nAll inline documentation is written using YUIDoc. Follow these rules when updating or writing new documentation. This is copied & pasted from [CONTRIBUTING.md](https://github.com/emberjs/ember.js/blob/master/CONTRIBUTING.md#pull-requests)\r\n\r\n- All code blocks must be fenced\r\n- All code blocks must have a language declared\r\n- All code blocks must be valid code for syntax highlighting\r\n- All code blocks should have an empty line before and after\r\n- All examples in code blocks must be aligned\r\n- Use two spaces between the code and the example: `foo(); // result`\r\n- All references to code words must be enclosed in backticks\r\n- Prefer a single space between sentences\r\n- Reference Ember.js as Ember.\r\n- Wrap long markdown blocks > 80 characters\r\n- Don't include blank lines after @param definitions\r\n\r\n## Example\r\n\r\n_Since this example is trying to show Templates and not teaching Component APIs, it's ok to show just Octane. 
We'll update the Component to use the native class syntax of `@glimmer/component`_ and change the filepaths to reflect co-location (where a component's hbs and js files are both side-by-side in the `components` folder. We also add `this` for component properties that belong to the component. `@` is used in templates when it is assumed that a property is passed in from the parent._\r\n\r\n### Before:\r\n\r\n Templates manage the flow of an application's UI, and display state (through\r\n the DOM) to a user. For example, given a component with the property \"name\",\r\n that component's template can use the name in several ways:\r\n\r\n```app/components/person-profile.js\r\n import Component from '@ember/component';\r\n\r\n export default Component.extend({\r\n name: 'Jill'\r\n });\r\n```\r\n\r\n```app/templates/components/person-profile.hbs\r\n {{name}}\r\n <div>{{name}}</div>\r\n <span data-name={{name}}></span>\r\n```\r\n\r\n### After\r\n\r\n Templates manage the flow of an application's UI, and display state (through\r\n the DOM) to a user. For example, given a component with the property \"name\",\r\n that component's template can use the name in several ways:\r\n\r\n```app/components/person-profile.js\r\nimport Component from '@glimmer/component';\r\n\r\nexport default class PersonProfile extends Component {\r\n name = 'Jill';\r\n}\r\n```\r\n\r\n```app/components/person-profile.hbs\r\n {{this.name}}\r\n <div>{{this.name}}</div>\r\n <span data-name={{this.name}}></span>\r\n```","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"contrib.data.Dataset - doc issue with Dataset.map / tf.py_func in 1.3.0rc0","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Include code snippet for Sequential model Quick Start in wiki","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Can't seem to capture models from the wishlist","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"unable to access website after composer update today","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-11693 (High) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-11693 - High Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nAn issue was discovered in LibSass through 3.5.4. 
An out-of-bounds read of a memory region was found in the function Sass::Prelexer::skip_over_scopes which could be leveraged by an attacker to disclose information or manipulated to read from unmapped memory causing a denial of service.\n\n<p>Publish Date: 2018-06-04\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11693>CVE-2018-11693</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: High\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11693\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11693</a></p>\n<p>Release Date: 2018-06-04</p>\n<p>Fix Resolution: 3.5.5</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation with examples","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# how do I tell hipSYCL where to find CUDA?\n\nI installed CUDA from Apt so it's in `/usr` not `/usr/local` as LLVM seems to want. Is there a CMake command to tell hipSYCL to pass the right flags to Clang? I am surprised that Clang can't figure out how to use `/usr`. Thanks.\r\n\r\n```\r\n~/Work/OpenCL/hipSYCL/build$ rm -rf * ; cmake -DCMAKE_INSTALL_PREFIX=/opt/sycl/hipsycl .. && make -k\r\n-- The C compiler identification is GNU 7.4.0\r\n-- The CXX compiler identification is GNU 7.4.0\r\n-- Check for working C compiler: /usr/bin/cc\r\n-- Check for working C compiler: /usr/bin/cc -- works\r\n-- Detecting C compiler ABI info\r\n-- Detecting C compiler ABI info - done\r\n-- Detecting C compile features\r\n-- Detecting C compile features - done\r\n-- Check for working CXX compiler: /usr/bin/c++\r\n-- Check for working CXX compiler: /usr/bin/c++ -- works\r\n-- Detecting CXX compiler ABI info\r\n-- Detecting CXX compiler ABI info - done\r\n-- Detecting CXX compile features\r\n-- Detecting CXX compile features - done\r\n-- Looking for pthread.h\r\n-- Looking for pthread.h - found\r\n-- Looking for pthread_create\r\n-- Looking for pthread_create - not found\r\n-- Looking for pthread_create in pthreads\r\n-- Looking for pthread_create in pthreads - not found\r\n-- Looking for pthread_create in pthread\r\n-- Looking for pthread_create in pthread - found\r\n-- Found Threads: TRUE \r\n-- Found CUDA: /usr (found version \"9.1\") \r\n-- Boost version: 1.66.0\r\n-- Found the following Boost libraries:\r\n-- filesystem\r\n-- system\r\n-- Boost version: 1.66.0\r\n-- Configuring done\r\n-- Generating done\r\n-- Build files have been written to: /home/jrhammon/Work/OpenCL/hipSYCL/build\r\nScanning dependencies of target hipSYCL_cpu\r\n[ 2%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/application.cpp.o\r\n[ 5%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/device.cpp.o\r\n[ 8%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/device_selector.cpp.o\r\n[ 11%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/exception.cpp.o\r\n[ 14%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/queue.cpp.o\r\n[ 17%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/handler.cpp.o\r\n[ 20%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/buffer.cpp.o\r\n[ 22%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/task_graph.cpp.o\r\n[ 25%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/accessor.cpp.o\r\n[ 28%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/async_worker.cpp.o\r\n[ 31%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cpu.dir/local_memory.cpp.o\r\n[ 34%] Linking CXX shared library libhipSYCL_cpu.so\r\n[ 34%] Built target hipSYCL_cpu\r\nScanning dependencies of target hipSYCL_cuda\r\n[ 37%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/application.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. 
Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:62: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/application.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/application.cpp.o] Error 1\r\n[ 40%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:75: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device.cpp.o] Error 1\r\n[ 42%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device_selector.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:88: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device_selector.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/device_selector.cpp.o] Error 1\r\n[ 45%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/exception.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:101: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/exception.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/exception.cpp.o] Error 1\r\n[ 48%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/queue.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:114: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/queue.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/queue.cpp.o] Error 1\r\n[ 51%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/handler.cpp.o\r\nclang: error: cannot find libdevice for sm_52. 
Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:127: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/handler.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/handler.cpp.o] Error 1\r\n[ 54%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/buffer.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:140: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/buffer.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/buffer.cpp.o] Error 1\r\n[ 57%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/task_graph.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:153: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/task_graph.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/task_graph.cpp.o] Error 1\r\n[ 60%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/accessor.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:166: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/accessor.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/accessor.cpp.o] Error 1\r\n[ 62%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/async_worker.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. 
Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:179: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/async_worker.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/async_worker.cpp.o] Error 1\r\n[ 65%] Building CXX object src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/local_memory.cpp.o\r\nclang: error: cannot find libdevice for sm_52. Provide path to different CUDA installation via --cuda-path, or pass -nocudalib to build without linking with libdevice.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nclang: error: cannot find CUDA installation. Provide its path via --cuda-path, or pass -nocudainc to build without CUDA includes.\r\nsrc/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/build.make:192: recipe for target 'src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/local_memory.cpp.o' failed\r\nmake[2]: *** [src/libhipSYCL/CMakeFiles/hipSYCL_cuda.dir/local_memory.cpp.o] Error 1\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Switch react-native-vector-icons to peerDependency","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Misleading definition of GPa conversion factor","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"postinstall should determine prefix from spec not ARGV","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update Images Docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"How to store file on remote server uploaded via DMS File Field","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Help me be a better debugger? ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Problem getting jest to work with vue and webpack","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Update README after testing\n\n@vsutinah tried to merge the branch into master","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"[RFC][DX] New command to debug form types","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# noobmaster69\n\n**Before you start, please follow this format for your issue title**: \r\nNOOBMASTER69 - Leap\r\n\r\n## \u2139\ufe0f Project information\r\n_Please complete all applicable._\r\n\r\n- **Leap**:\r\n- **Complete package for mental health and recreational activities focused on homemaker women. \r\n- **Team Name**:\r\n- **Noobmaster69** - vkartik2k, mukulsoni29, Mihir-Rajora20\r\n- **Demo Link**: _(if any, this might contain a website/ mobile application link/ short video, etc.)_\r\n- **Repository Link**: https://github.com/vkartik2k/cftHacks\r\n- **Labels**: Node.js, jquery, bootstrap, python, tensorflow, flask \r\n\r\n## \ud83d\udd25 Your Pitch\r\n_Kindly write a [pitch](https://medium.com/next-media-accelerator/pitch-your-hackathon-product-in-3-minutes-and-conquer-the-jury-9f86bfbdba6f) for your project. Please do not use more than 500 words_\r\n\r\nhttps://docs.google.com/presentation/d/1jDOFDfcAN2miODhqHyhNpc1CpnfGLJB-mKvs3XuYmYw/edit\r\n\r\n\r\n## \ud83d\udd26 Any other specific thing you want to highlight?\r\nPlease read the readme.md\r\n\r\n## \u2705 Checklist\r\n\r\n**Before you post the issue**:\r\n- [x] You have followed the issue title format.\r\n- [x] You have mentioned the correct labels.\r\n- [x] You have provided all the information correctly.\r\n- [x] You have uploaded the pitch deck to the given Google Drive\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add a link to file an issue","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Can't figure out how to proceed in the installation ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# missing dependencies in creact-react-app project\n\nwhen I tape `yarn check` in my react.js project it appears this error\r\n```\r\nyarn check v1.16.0\r\ninfo [email protected]: The platform \"linux\" is incompatible with this module.\r\ninfo \"[email protected]\" is an optional dependency and failed compatibility check. Excluding it from installation.\r\ninfo [email protected]: The platform \"linux\" is incompatible with this module.\r\ninfo \"[email protected]\" is an optional dependency and failed compatibility check. Excluding it from installation.\r\nwarning \"react-scripts#babel-jest@^24.8.0\" could be deduped from \"24.8.0\" to \"[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-preset-react-app#@babel/[email protected]\" could be deduped from \"7.0.0\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-loader#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-plugin-named-asset-import#@babel/core@^7.1.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"jest-resolve#jest-pnp-resolver#jest-resolve@*\" could be deduped from \"24.8.0\" to \"[email protected]\"\r\nwarning \"webpack#chrome-trace-event#tslib@^1.9.0\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-class-properties#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-decorators#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-object-rest-spread#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-syntax-dynamic-import#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-classes#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-destructuring#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-flow-strip-types#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-react-display-name#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-runtime#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-async-generator-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-json-strings@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning 
\"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-optional-catch-binding@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-unicode-property-regex@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-syntax-async-generators@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-syntax-optional-catch-binding@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-arrow-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-async-to-generator@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-block-scoped-functions@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-block-scoping@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-computed-properties@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-dotall-regex@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-duplicate-keys@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-exponentiation-operator@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-for-of@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-function-name@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-member-expression-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-amd@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-commonjs@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-systemjs@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-modules-umd@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-named-capturing-groups-regex@^7.4.2\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning 
\"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-new-target@^7.4.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-object-super@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-parameters@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-property-literals@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-regenerator@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-reserved-words@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-spread@^7.2.0\" could be deduped from \"7.2.2\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-template-literals@^7.2.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-typeof-symbol@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-transform-unicode-regex@^7.4.3\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-display-name@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx@^7.0.0\" could be deduped from \"7.3.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-self@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-source@^7.0.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"react-scripts#babel-jest#babel-preset-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"@typescript-eslint/eslint-plugin#tsutils#tslib@^1.8.1\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"eslint#inquirer#rxjs#tslib@^1.9.0\" could be deduped from \"1.10.0\" to \"[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-class-properties#@babel/helper-create-class-features-plugin#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/preset-env#@babel/plugin-proposal-async-generator-functions#@babel/plugin-syntax-async-generators@^7.2.0\" could be deduped from \"7.2.0\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-proposal-decorators#@babel/plugin-syntax-decorators#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to 
\"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-flow-strip-types#@babel/plugin-syntax-flow#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"@babel/preset-react#@babel/plugin-transform-react-display-name#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nerror \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-self\" not installed\r\nerror \"babel-preset-react-app#@babel/preset-react#@babel/plugin-transform-react-jsx-source\" not installed\r\nwarning \"babel-preset-react-app#@babel/preset-typescript#@babel/plugin-transform-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"babel-preset-react-app#@babel/plugin-transform-typescript#@babel/plugin-syntax-typescript#@babel/core@^7.0.0-0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nwarning \"jest-config#babel-jest#@babel/core@^7.0.0\" could be deduped from \"7.4.4\" to \"@babel/[email protected]\"\r\nerror \"babel-jest#babel-preset-jest\" not installed\r\ninfo Found 68 warnings.\r\nerror Found 3 errors.\r\ninfo Visit https://yarnpkg.com/en/docs/cli/check for documentation about this command.\r\n```\r\nI attempted to fix it with `yarn install` but nothing change and show that they all up-to-date\r\n```\r\nyarn install v1.16.0\r\n[1/4] Resolving packages...\r\nsuccess Already up-to-date.\r\nDone in 0.85s.\r\n```\r\nI can't understand what's the reason that causes this and how to fix it.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Does all-subs download only one subtitle per language?\n\n<!--\r\n\r\n######################################################################\r\n WARNING!\r\n IGNORING THE FOLLOWING TEMPLATE WILL RESULT IN ISSUE CLOSED AS INCOMPLETE\r\n######################################################################\r\n\r\n-->\r\n\r\n\r\n## Checklist\r\n\r\n<!--\r\nCarefully read and work through this check list in order to prevent the most common mistakes and misuse of youtube-dl:\r\n- Look through the README (http://yt-dl.org/readme) and FAQ (http://yt-dl.org/faq) for similar questions\r\n- Search the bugtracker for similar questions: http://yt-dl.org/search-issues\r\n- Finally, put x into all relevant boxes (like this [x])\r\n-->\r\n\r\n- [x] I'm asking a question\r\n- [x] I've looked through the README and FAQ for similar questions\r\n- [x] I've searched the bugtracker for similar questions including closed ones\r\n\r\n\r\n## Question\r\n\r\n<!--\r\nAsk your question in an arbitrary form. Please make sure it's worded well enough to be understood, see https://github.com/ytdl-org/youtube-dl#is-the-description-of-the-issue-itself-sufficient.\r\n-->\r\n\r\nIf a video has multiple subtitile formats for a single language, would the --all-subs retrieve just one subtitle per language or all of them?\r\n\r\nReason for asking:\r\nhttps://www.sbs.com.au/ondemand/video/11821123796/hestons-feasts-victorian contains 2 formats: srt and dfxp. --list-subs confirms both are present.\r\nHowever --all-subs would only download the srt. I can override that using --sub-format dfxp, so the dfxp file is available, but I do not understand why it would not be retrieved with --all-subs.\r\n\r\nThis may be a site broken issue, but I have not found another site with more than one format per language, so I am not able to confirm whether it is a bug, a site broken issue or a by-design choice.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Spigot 1.11.2 Invalid plugin.yml","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Podfile warning code on iOS 11 version but not on iOS 12 version Readme\n\nThe Podfile 'CLANG_WARN_DOCUMENTATION_COMMENTS' that is referenced in the video is only on the iOS 11 version not the iOS 12 version. Not sure if that is intended","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Overview of deprecated stuff","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Catch errors with sentry","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Extend a shareable config from another package - do not support npm3 dir structure","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"azure_sd_configs api change","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Alsatian printing \"plete plete plete\" when running in CI mode\n\nWhen alsatian runs in CI mode (Travis), it prints the word \"plete\". I don't know what it means.\r\n\r\nExample: https://travis-ci.org/jhm-ciberman/docs_gm/jobs/570427708#L242 (Line 242 onwards) \r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# What is the purpose of '1' in SELECT COUNT(1) ...\n\nPlease provide more details.\n\n---\n#### Document Details\n\n\u26a0 *Do not edit this section. It is required for docs.microsoft.com \u279f GitHub issue linking.*\n\n* ID: 8dbaec6a-4f7e-736b-2e6a-c3f6a5529221\n* Version Independent ID: 18fc80ab-e47e-9042-43e4-88d721f6ba57\n* Content: [Aggregate functions in Azure Cosmos DB](https://docs.microsoft.com/en-us/azure/cosmos-db/sql-query-aggregates)\n* Content Source: [articles/cosmos-db/sql-query-aggregates.md](https://github.com/Microsoft/azure-docs/blob/master/articles/cosmos-db/sql-query-aggregates.md)\n* Service: **cosmos-db**\n* GitHub Login: @markjbrown\n* Microsoft Alias: **mjbrown**","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Flags doc error\n\nIn the docs for Flags, it lists the abbrev for ShiftTabs as ST, but it should be SHT.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Document v1.0 features","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"docs.openhab.org/configuration/services.html is a broken link","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Suggestion: create a `pkgdown` style website","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"player.flvjs is not a function","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Explanation for pg:outliers headers","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Possible typo in query","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# BCD for text-decoration-skip\n\nIn https://developer.mozilla.org/en-US/docs/Web/CSS/text-decoration-skip, the BCD says that Safari only supports the none and skip, but skip is not a value.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"last release Phoenix 1.3 RC 3...isn't support?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation add on request","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"TODOs","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Changes to the bgprocess","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Missing public key for apt-get update","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# profiling notes\n\n`node --prof ./build/src/cli \"../abapGit/src/**/*.*\" -s`\r\n`node --prof-process isolate-0xnnnnnnnnnnnn-v8.log > processed.txt`\r\nhttps://mapbox.github.io/flamebearer/\r\nhttps://clinicjs.org/documentation/","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# TODO List for v1.3.6\n\nThis is a TODO list for next release v1.3.6. Because my device was physically damaged and it was sent back to my homeland for repairing, I may not contribute more to this repository. I hope someone can take over this repository and finish the following things:\r\n\r\n- [ ] Update Lilu and its plugins (only use release version)\r\n - No SMCLightSensor, SuperIO, VoodooPS2Mouse, and VoodooPS2Trackpad\r\n - Modify VoodooPS2 as https://github.com/daliansky/XiaoMi-Pro-Hackintosh/commit/2a3739eea1e97babc6928a1237a281c0f47375f3 and https://github.com/daliansky/XiaoMi-Pro-Hackintosh/commit/a4309ba4d1cda06db37f7b48632e648d3d17e877\r\n- [ ] Update OpenCore and its config.plist (only use release version; config refers to https://github.com/acidanthera/OpenCorePkg/blob/master/Docs/Sample.plist)\r\n- [ ] Update Clover\r\n - Keep CLOVER/drivers/UEFI/ the same, it is already up-to-date\r\n - No new folders, keep directory clean\r\n- [ ] Inject Clover's default SerialNumber and BoardSerialNumber to OpenCore's config \r\n - Clover's SerialNumber = PlatformInfo - Generic - SystemSerialNumber, \r\nClover's BoardSerialNumber = PlatformInfo - Generic - MLB)\r\n- [ ] Add `complete-modeset-framebuffers` property to enforce complete IGPU modeset on con1; otherwise, the internal screen turns black after connecting the left HDMI port\r\n\r\nATTENTION: Never use `Clover Configurator` or `OC Configurator` to edit config.plist. They will mess up the format.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# config variable $config['des_key'] is not individual when it is not configured for a container \n\nthis is a security issue! In a running roundcube installation this has to be configured individiually. \r\n\r\n```\r\n// this key is used to encrypt the users imap password which is stored\r\n// in the session record (and the client cookie if remember password is enabled).\r\n// please provide a string of exactly 24 chars.\r\n// YOUR KEY MUST BE DIFFERENT THAN THE SAMPLE VALUE FOR SECURITY REASONS\r\n\r\n```\r\nhowever: \r\n\r\n```\r\nroot@c62c0e0af57f:/var/www/html/config# grep des *\r\nconfig.inc.php.sample:$config['des_key'] = 'rcmail-!24ByteDESkey*Str';\r\ndefaults.inc.php:// Includes should be interpreted as PHP files\r\ndefaults.inc.php:$config['des_key'] = 'rcmail-!24ByteDESkey*Str';\r\ndefaults.inc.php: 'notes' => 'description',\r\ndefaults.inc.php:// Interface layout. Default: 'widescreen'.\r\ndefaults.inc.php:// 'widescreen' - three columns\r\ndefaults.inc.php:// 'desktop' - two columns, preview on bottom\r\ndefaults.inc.php:$config['layout'] = 'widescreen';\r\nmimetypes.php: 'ppsm' => 'application/vnd.ms-powerpoint.slideshow.macroEnabled.12',\r\nmimetypes.php: 'ppsx' => 'application/vnd.openxmlformats-officedocument.presentationml.slideshow',\r\n```\r\n\r\nlooks very samplish to me. \r\n\r\nI would have expected the docker image to at least create a des_key if no key is provided by the administrator. I would have also expected that this switch is documented on the dockerhub site and in the README.md to this container, as this is the place peope are supposed to look to get their stuff running. \r\n\r\nWorkaround: I am going to place a config.php into the container via compose.yml to configure that key. ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"outdated example with Yahoo Finance API","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Dev test","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"AddToCart method relies on the posted productDetails","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Signals caused by collisions should emit at the end of the physics step, instead of just before the next physics step\n\n<!-- Please search existing issues for potential duplicates before filing yours:\r\nhttps://github.com/godotengine/godot/issues?q=is%3Aissue\r\n-->\r\n\r\n**Godot version:**\r\nv3.1.1.stable.official\r\n\r\n**OS/device including version:**\r\nWindows 10\r\n\r\n**Issue description:**\r\nI was studying the game loop in Godot and what defines a \"frame\", and came across a potential issue.\r\n\r\nAs I understand it, _physics_process occurs at an FPS defined in your project settings, and _process occurs either at the rate of your monitor's vsync, or as fast as possible.\r\n\r\n_process will occur when a frame is going to be drawn, while _physics_process simulates the world at a regular rate. According to the documentation, if both are due (which can happen if the Physics FPS is set to 60, and vysnc is on and set to 60 FPS), then _physics_process will happen first, followed by the physics step, followed by _process, followed by the frame drawing.\r\n\r\nOK so far, but if a collision or overlap occurs during the physics step, I would like to see the signals for those collisions be emitted right after the physics step finishes. I attached a project where I set the Physics FPS to 5, and it seems that instead, the signals for a collision are emitted just before the beginning of the next _physics_process call. \r\n\r\nIn the project, the moving rectangle should turn red when it overlaps the non-moving rectangle. There are several frames where the boxes are visibly overlapping, but not reacting. If the area_entered/exited signals were emitted right after the physics step, this could be avoided.\r\n\r\nI know I'm exaggerating the problem in this example with such a low Physics FPS. But, it's good to see game events happening \"the same frame\" - for instance, if you spawned an object in your world, and it didn't visibly react to anything in the first 1/60th second of its life, sharp-eyed players might consider it a glitch or a lack of polish.\r\n\r\n**Minimal reproduction project:**\r\n[CollisionSignals.zip](https://github.com/godotengine/godot/files/3489279/CollisionSignals.zip)\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# The iteration during training is always 0\n\nDear author,\r\n\r\nThank you for your efforts in putting the source code online. I had a problem when recreating PCC-RL. The built environment allows the client and server to ping, according to the steps of DeepLearning_Readme.md. When using shim_solver training, iter is always 0, can't interact with the environment. Where should I view this problem?\r\nI look forward to your answer. Thank you.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add document strings for constants","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Create diagnostics for issues found in an ONBUILD's trigger instruction","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2019-11358 (Medium) detected in jquery-3.3.1.min.js\n\n## CVE-2019-11358 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-3.3.1.min.js</b></p></summary>\n\n<p>JavaScript library for DOM operations</p>\n<p>Library home page: <a href=\"https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js\">https://cdnjs.cloudflare.com/ajax/libs/jquery/3.3.1/jquery.min.js</a></p>\n<p>Path to dependency file: /website/docs/nucleo-icons.html</p>\n<p>Path to vulnerable library: /website/docs/nucleo-icons.html,/website/docs/examples/../assets/js/core/jquery.min.js,/website/docs/./assets/js/core/jquery.min.js</p>\n<p>\n\nDependency Hierarchy:\n - :x: **jquery-3.3.1.min.js** (Vulnerable Library)\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \njQuery before 3.4.0, as used in Drupal, Backdrop CMS, and other products, mishandles jQuery.extend(true, {}, ...) because of Object.prototype pollution. If an unsanitized source object contained an enumerable __proto__ property, it could extend the native Object.prototype.\n\n<p>Publish Date: 2019-04-20\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358>CVE-2019-11358</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Changed\n- Impact Metrics:\n - Confidentiality Impact: Low\n - Integrity Impact: Low\n - Availability Impact: None\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11358</a></p>\n<p>Release Date: 2019-04-20</p>\n<p>Fix Resolution: 3.4.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"When and how to use build-tests.sh and run-test.sh","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"runing django TemplateDoesNotExist","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"List the Permissions Required to Create Wrapped Token","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":" Error in generator gui interface when clicking generate","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"gib-da-repo-baws","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Functional examples","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"https://angular.io/guide/webpack Missing class view-SideNav in aio-shell","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"add me","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Display X Annotation labels horizontally\n\nI'm attempting to display my annotation labels horizontally when using `x_annotation`. By default they show the text and label vertically. When looking at the apexcharts documentation I noticed an option for this, but when trying it with this gem my labels just disappear completely.\r\n\r\nHere's the line of code I used... `<% x_annotation(value: eos_weight.date, text: \"#{eos_weight.weigh_in_value}\", color: 'purple', label: { orientation: \"horizontal\" }) %>`\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Compilation fails due to deny(warnings)\n\n```rust\r\nerror: missing documentation for macro\r\n --> src/macros.rs:106:1\r\n |\r\n106 | macro_rules! sudo_io_static_fn {\r\n | ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\r\n |\r\nnote: lint level defined here\r\n --> src/lib.rs:21:9\r\n |\r\n21 | #![deny(warnings)]\r\n | ^^^^^^^^\r\n = note: #[deny(missing_docs)] implied by #[deny(warnings)]\r\n\r\nerror: missing documentation for macro\r\n --> src/macros.rs:182:1\r\n |\r\n182 | macro_rules! sudo_io_fn {\r\n | ^^^^^^^^^^^^^^^^^^^^^^^\r\n\r\nerror: aborting due to 2 previous errors\r\n\r\nerror: Could not compile `sudo_plugin`.\r\n```\r\n\r\nUsing `deny(warnings)` is really bad idea since software would stop compiling after compiler update.\r\n\r\nPlease use `-D warnings` in CI instead.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Video tutorial","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Bindings not working properly after replacing FormArray using setControl() and adding new controls","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation not found","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"trouble running Oracle_cx on macOS","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[bug] Background images break source","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [IDE docs] Jetbrains (IntelliJ, Webstorm) syntax highlighting\n\nIt would be great to have some docs on how to add html/css in template literals syntax highlighting for the Jetbrains IDEs. Does anyone know how to do this?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"clarification about generated bound methods","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"\"csc.exe\" exited with code -532462766.\t","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Cleaned-up version of QFusionStyle\n\nGood job with this phantomstyle Andrew, there's not many QStyle floating around. I'm undortunately still stuck with Qt4.8 so I can't really play with phantomstyle.\r\n\r\nI was wondering if there's anywhere where you commited your initial fixes to QFusionStyle, for us to learn what you cleaned up (I'm referring to the wrong sizeing, useless repaint and so on you describe in the readme)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Move.from_uci will raise IndexError when passed a zero length string","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[BUG] Decorated classes not included in the document","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Installing Realm Node.JS SDK professional edition - Error: Cannot find module '../compiled/linux-x64/node-v57/realm-node.node'","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Using ISO padding?\n\nThe README notes state that you can use ISO7816-4 as a padding. How would this be achieved with AES?\r\nMy code currently looks like this to encrypt:\r\n```dart\r\nvar key = utf8.encode(key32);\r\nvar ivLocal = utf8.encode(iv);\r\nCipherParameters params = PaddedBlockCipherParameters(ParametersWithIV(KeyParameter(key), ivLocal), null);\r\nPaddedBlockCipherImpl cipherImpl = PaddedBlockCipherImpl(Padding('ISO7816-4'), CBCBlockCipher(AESFastEngine()));\r\ncipherImpl.init(true, params);\r\nreturn cipherImpl.process(utf8.encode(input));\r\n```\r\nI know this package is mostly inactive, but could anyone help me out?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Nil id_token passed\n\n### **Description**\r\n\r\nI have followed all the documentation for creating a Xamarin Forms app to use App Center Auth and after signing-up/signing-in I receive the following exception:\r\n\r\n```\r\nUser sign-in failed. Error: Error Domain=MSALErrorDomain Code=-50000 \"(null)\" UserInfo={MSALErrorDescriptionKey=Nil id_token passed, MSALInternalErrorCodeKey=-42600}\r\n```\r\n\r\nIt looks as though Azure AD is logging the user in correctly but I never receive any user information after calling:\r\n\r\n```\r\nvar userInfo = await Auth.SignInAsync();\r\n```\r\n\r\nHas anyone else seen this error before?\r\n\r\n### **Repro Steps**\r\n\r\nPlease list the steps used to reproduce your issue.\r\n\r\n1. Follow steps for setting up Auth in App Center\r\n2. Add example code\r\n3. Sign-up\r\n\r\n### **Details**\r\n\r\n1. What is your app platform (Xamarin.Android or Xamarin.iOS or UWP)?\r\n - Xamarin.Forms on iOS\r\n2. If using Xamarin.Forms or if using portable/shared code to call our SDK APIs, are you using shared project, PCL code or .NET standard code for the application? Which .NET standard version or which PCL profile?\r\n - .NET standard 2.0.\r\n2. Which SDK version are you using?\r\n - 2.1.0\r\n4. Which OS version did you experience the issue on?\r\n - iOS 12.2\r\n5. What device version did you see this error on? Were you using an emulator or a physical device?\r\n - iPhone Simulator\r\n6. What third party libraries are you using?\r\n - Microsoft.AppCenter\r\n7. Please enable verbose logging for your app using `AppCenter.LogLevel = LogLevel.Verbose` before your call to `AppCenter.Start(...)` and include the logs here:\r\n\r\n[AppCenter] VERBOSE: -[MSDelegateForwarder addTraceBlock:]_block_invoke_2/88 Start buffering traces.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder setEnabledFromPlistForKey:]_block_invoke/278 Delegate forwarder for info.plist key 'AppCenterAppDelegateForwarderEnabled' enabled. It may use swizzling.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder setEnabledFromPlistForKey:]_block_invoke/278 Delegate forwarder for info.plist key 'AppCenterUserNotificationCenterDelegateForwarderEnabled' enabled. 
It may use swizzling.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'setDelegate:' of class 'UNUserNotificationCenter' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'setDelegate:' of class 'UIApplication' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'application:didRegisterForRemoteNotificationsWithDeviceToken:' of class 'AppDelegate' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'application:didReceiveRemoteNotification:' of class 'AppDelegate' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'application:didFailToRegisterForRemoteNotificationsWithError:' of class 'AppDelegate' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'application:didReceiveRemoteNotification:fetchCompletionHandler:' of class 'AppDelegate' is swizzled.\r\n[AppCenter] DEBUG: -[MSDelegateForwarder swizzleOriginalSelector:withCustomSelector:originalClass:]_block_invoke/205 Selector 'application:openURL:options:' of class 'AppDelegate' is swizzled.\r\n[AppCenter] VERBOSE: +[MSDelegateForwarder flushTraceBuffer]/106 Stop buffering traces, flushed.\r\n[AppCenter] DEBUG: -[MSSessionContext init]/42 1 session(s) in the history.\r\n[AppCenter] VERBOSE: -[MSSessionContext setSessionId:]/66 Stored new session with id:(null) and timestamp: 2019-08-10 22:19:36 +0000.\r\n[AppCenter] DEBUG: -[MSUserIdContext init]/54 1 userId(s) in the history.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] DEBUG: -[MSDBStorage initWithSchema:version:filename:]_block_invoke/29 SQLite global configuration successfully updated.\r\n[AppCenter] INFO: -[MSAppCenter configureWithAppSecret:transmissionTargetToken:fromApplication:]/281 App Center SDK configured successfully.\r\n[AppCenter] INFO: -[MSDBStorage initWithSchema:version:filename:]/60 Migrating \"Documents.sqlite\" database from version 0 to 1.\r\n[AppCenter] DEBUG: +[MSDBStorage enableAutoVacuumInOpenedDatabase:]/229 Vacuuming database and enabling auto_vacuum\r\n[AppCenter] VERBOSE: -[MSAppCenter start:withServices:fromApplication:]/296 Start services MSCrashes, MSAnalytics, MSAuth, MSData, MSPush from an application\r\n[AppCenterCrashes] DEBUG: -[MSCrashes configureCrashReporterWithUncaughtExceptionHandlerEnabled:]/563 EnableUncaughtExceptionHandler is set to NO, we're running in a Xamarin runtime.\r\n[AppCenterCrashes] DEBUG: -[MSCrashes configureCrashReporterWithUncaughtExceptionHandlerEnabled:]/606 Exception handler successfully initialized but it has not been registered due to the wrapper SDK.\r\n[AppCenter] VERBOSE: -[MSSessionContext clearSessionHistoryAndKeepCurrentSession:]/88 Cleared old sessions.\r\n[AppCenter] VERBOSE: -[MSUserIdContext clearUserIdHistory]/124 Cleared old userIds while keeping current userId.\r\n[AppCenterCrashes] INFO: -[MSCrashes applyEnabledState:]/331 Crashes service has 
been enabled.\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes startWithChannelGroup:appSecret:transmissionTargetToken:fromApplication:]/370 Started crash service.\r\n[AppCenter] VERBOSE: -[MSChannelUnitDefault resumeWithIdentifyingObjectSync:]/551 Identifying object <MSAnalytics: 0x6000035c95f0> removed from pause lane for channel Analytics.\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault resumeWithIdentifyingObjectSync:]/553 Resume channel Analytics.\r\n[AppCenter] VERBOSE: -[MSChannelUnitDefault resumeWithIdentifyingObjectSync:]/551 Identifying object <MSAnalytics: 0x6000035c95f0> removed from pause lane for channel Analytics/one.\r\n[AppCenter] VERBOSE: -[MSSessionContext setSessionId:]/66 Stored new session with id:BE8C4F4A-87D6-4FAE-AEF1-C60E16BCC38B and timestamp: 2019-08-10 22:19:36 +0000.\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault resumeWithIdentifyingObjectSync:]/553 Resume channel Analytics/one.\r\n[AppCenterAnalytics] INFO: -[MSSessionTracker renewSessionId]/49 New session ID: BE8C4F4A-87D6-4FAE-AEF1-C60E16BCC38B\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didPrepareLog:internalId:flags:]/440 Storing a log to Crashes Buffer: (sid: BE8C4F4A-87D6-4FAE-AEF1-C60E16BCC38B, type: startSession)\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didPrepareLog:internalId:flags:]/451 Found an empty buffer position.\r\n[AppCenterAnalytics] INFO: -[MSAnalytics applyEnabledState:]/154 Analytics service has been enabled.\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault enqueueItem:flags:]_block_invoke/187 Saving log, type: startSession, flags: 1.\r\n[AppCenterAnalytics] VERBOSE: -[MSAnalytics startWithChannelGroup:appSecret:transmissionTargetToken:fromApplication:]/106 Started Analytics service.\r\n[AppCenter] VERBOSE: -[MSLogDBStorage saveLog:withGroupId:flags:]_block_invoke/120 Log is stored with id: '51'\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didCompleteEnqueueingLog:internalId:]/493 Deleting a log from buffer with id AB5581AD-B659-4555-B07E-C718E4CDB37F\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenterAuth] VERBOSE: -[MSAuthConfigIngestion createRequest:eTag:authToken:]/50 URL: https://config.appcenter.ms/auth/****************************bdab465a.json\r\n[AppCenterAuth] VERBOSE: -[MSAuthConfigIngestion createRequest:eTag:authToken:]/52 Headers: If-None-Match = 0x8D71DACE81D5ADF\r\n[AppCenterAuth] INFO: -[MSAuth applyEnabledState:]/120 Auth service has been enabled.\r\n[AppCenterAuth] VERBOSE: -[MSAuth startWithChannelGroup:appSecret:transmissionTargetToken:fromApplication:]/78 Started Auth service.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenter] INFO: -[MSHttpIngestion networkStateChanged]/356 Internet connection is up.\r\n[AppCenterData] INFO: -[MSData networkStateChanged:]/778 Network connection is on.\r\n[AppCenter] WARNING: -[MSAuthTokenContext authTokenHistory]/210 Failed to retrieve history state or none was found.\r\n[AppCenter] INFO: -[MSHttpClient networkStateChanged:]/202 Internet connection is up.\r\n[AppCenterData] VERBOSE: -[MSData startWithChannelGroup:appSecret:transmissionTargetToken:fromApplication:]/712 Started Data service.\r\n[AppCenterPush] VERBOSE: -[MSPush 
registerForRemoteNotifications]/217 Registering for push notifications\r\n[AppCenterPush] INFO: -[MSPush applyEnabledState:]/195 Push service has been enabled.\r\n[AppCenterPush] VERBOSE: -[MSPush startWithChannelGroup:appSecret:transmissionTargetToken:fromApplication:]/145 Started push service.\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didPrepareLog:internalId:flags:]/440 Storing a log to Crashes Buffer: (sid: (null), type: startService)\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didPrepareLog:internalId:flags:]/451 Found an empty buffer position.\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault enqueueItem:flags:]_block_invoke/187 Saving log, type: startService, flags: 1.\r\n[AppCenter] VERBOSE: -[MSLogDBStorage saveLog:withGroupId:flags:]_block_invoke/120 Log is stored with id: '52'\r\n[AppCenterCrashes] VERBOSE: -[MSCrashes channel:didCompleteEnqueueingLog:internalId:]/493 Deleting a log from buffer with id 1D7F0A2D-857F-4DBC-BB7C-4BD04BDDCE6D\r\n[AppCenterPush] VERBOSE: -[MSPush registerForRemoteNotifications]_block_invoke_2/229 Push notifications authorization was granted.\r\n[AppCenterPush] WARNING: -[MSPush didFailToRegisterForRemoteNotificationsWithError:]/281 Registering for push notifications has been finished with error: remote notifications are not supported in the simulator\r\n[AppCenter] VERBOSE: -[MSHttpIngestion sendCallAsync:]_block_invoke/256 HTTP response received with status code: 304, payload:\r\n(null)\r\n[AppCenterAuth] INFO: -[MSAuth downloadConfigurationWithETag:]_block_invoke/320 Auth config hasn't changed.\r\n[AppCenter] INFO: -[MSHttpIngestion pause]/163 Pause ingestion.\r\n[AppCenter] INFO: -[MSHttpIngestion call:completedWithResult:]/305 Removed call id:A8F61ADF-89F6-434E-B528-6F80EC3FED1A from pending calls:{\r\n}\r\n[AppCenter] VERBOSE: -[MSLogDBStorage loadLogsWithGroupId:limit:excludedTargetKeys:afterDate:beforeDate:completionHandler:]/212 Load log(s) with id(s) '51' as batch Id:0D4A7EAD-9976-461B-BB80-EC817BFFA33B\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault sendLogContainer:withAuthTokenFromArray:atIndex:]/224 Sending 1/1 log, group Id: Analytics, batch Id: 0D4A7EAD-9976-461B-BB80-EC817BFFA33B, session Id: BE8C4F4A-87D6-4FAE-AEF1-C60E16BCC38B, payload:\r\n{\r\n \"sid\" : \"BE8C4F4A-87D6-4FAE-AEF1-C60E16BCC38B\",\r\n \"timestamp\" : \"2019-08-10T22:19:36.902Z\",\r\n \"device\" : {\r\n \"appVersion\" : \"1.0\",\r\n \"appBuild\" : \"1.0\",\r\n \"osName\" : \"iOS\",\r\n \"timeZoneOffset\" : 60,\r\n \"wrapperSdkVersion\" : \"2.1.1\",\r\n \"osVersion\" : \"12.2\",\r\n \"wrapperRuntimeVersion\" : \"11.14.0\",\r\n \"locale\" : \"en_US\",\r\n \"wrapperSdkName\" : \"appcenter.xamarin\",\r\n \"appNamespace\" : \"xxx.xxxxxxxxxx.xxxxxx\",\r\n \"osBuild\" : \"18E226\",\r\n \"sdkName\" : \"appcenter.ios\",\r\n \"oemName\" : \"Apple\",\r\n \"sdkVersion\" : \"2.1.0\",\r\n \"model\" : \"x86_64\",\r\n \"screenSize\" : \"1334x750\"\r\n },\r\n \"type\" : \"startSession\"\r\n}\r\n[AppCenter] VERBOSE: -[MSAppCenterIngestion createRequest:eTag:authToken:]/91 URL: https://in.appcenter.ms/logs?api-version=1.0.0\r\n[AppCenter] VERBOSE: -[MSAppCenterIngestion createRequest:eTag:authToken:]/92 Headers: Install-ID = 79663819-BEC8-4565-B365-8EF1309BC72B, App-Secret = ****************************bdab465a, Content-Type = application/json\r\n[AppCenter] VERBOSE: -[MSLogDBStorage loadLogsWithGroupId:limit:excludedTargetKeys:afterDate:beforeDate:completionHandler:]/212 Load log(s) with id(s) '52' as batch Id:014F696E-7F87-4E3A-9C7A-AE125896E140\r\n[AppCenter] DEBUG: 
-[MSChannelUnitDefault sendLogContainer:withAuthTokenFromArray:atIndex:]/224 Sending 1/1 log, group Id: AppCenter, batch Id: 014F696E-7F87-4E3A-9C7A-AE125896E140, session Id: (null), payload:\r\n{\r\n \"device\" : {\r\n \"appVersion\" : \"1.0\",\r\n \"appBuild\" : \"1.0\",\r\n \"osName\" : \"iOS\",\r\n \"timeZoneOffset\" : 60,\r\n \"wrapperSdkVersion\" : \"2.1.1\",\r\n \"osVersion\" : \"12.2\",\r\n \"wrapperRuntimeVersion\" : \"11.14.0\",\r\n \"locale\" : \"en_US\",\r\n \"wrapperSdkName\" : \"appcenter.xamarin\",\r\n \"appNamespace\" : \"xxx.xxxxxxxxxx.xxxxxx\",\r\n \"osBuild\" : \"18E226\",\r\n \"sdkName\" : \"appcenter.ios\",\r\n \"oemName\" : \"Apple\",\r\n \"sdkVersion\" : \"2.1.0\",\r\n \"model\" : \"x86_64\",\r\n \"screenSize\" : \"1334x750\"\r\n },\r\n \"timestamp\" : \"2019-08-10T22:19:36.921Z\",\r\n \"type\" : \"startService\",\r\n \"services\" : [\r\n \"Crashes\",\r\n \"Analytics\",\r\n \"Auth\",\r\n \"Data\",\r\n \"Push\"\r\n ]\r\n}\r\n[AppCenter] VERBOSE: -[MSAppCenterIngestion createRequest:eTag:authToken:]/91 URL: https://in.appcenter.ms/logs?api-version=1.0.0\r\n[AppCenter] VERBOSE: -[MSAppCenterIngestion createRequest:eTag:authToken:]/92 Headers: Install-ID = 79663819-BEC8-4565-B365-8EF1309BC72B, App-Secret = ****************************bdab465a, Content-Type = application/json\r\n[AppCenter] VERBOSE: -[MSHttpIngestion sendCallAsync:]_block_invoke/256 HTTP response received with status code: 200, payload:\r\nCorrelationId: 4ef84ce1-ee2a-406e-89ce-33cd3031aca7 ReasonCode: Success\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault sendLogContainer:withAuthTokenFromArray:atIndex:]_block_invoke/247 Log(s) sent with success, batch Id:0D4A7EAD-9976-461B-BB80-EC817BFFA33B.\r\n[AppCenter] INFO: -[MSHttpIngestion call:completedWithResult:]/305 Removed call id:0D4A7EAD-9976-461B-BB80-EC817BFFA33B from pending calls:{\r\n \"014F696E-7F87-4E3A-9C7A-AE125896E140\" = \"<MSIngestionCall: 0x60000229f240>\";\r\n}\r\n[AppCenter] VERBOSE: +[MSLogDBStorage deleteLogsFromDBWithColumnValues:columnName:inOpenedDatabase:]/353 Deletion of log(s) by id with value(s) '51' succeeded.\r\n[AppCenter] VERBOSE: -[MSHttpIngestion sendCallAsync:]_block_invoke/256 HTTP response received with status code: 200, payload:\r\nCorrelationId: 9c8790cf-fbfa-41ee-8882-f7eb79029586 ReasonCode: Success\r\n[AppCenter] INFO: -[MSHttpIngestion call:completedWithResult:]/305 Removed call id:014F696E-7F87-4E3A-9C7A-AE125896E140 from pending calls:{\r\n}\r\n[AppCenter] DEBUG: -[MSChannelUnitDefault sendLogContainer:withAuthTokenFromArray:atIndex:]_block_invoke/247 Log(s) sent with success, batch Id:014F696E-7F87-4E3A-9C7A-AE125896E140.\r\n[AppCenter] VERBOSE: +[MSLogDBStorage deleteLogsFromDBWithColumnValues:columnName:inOpenedDatabase:]/353 Deletion of log(s) by id with value(s) '52' succeeded.\r\n[AppCenter] WARNING: -[MSAuthTokenContext setAuthTokenHistory:]/225 Failed to save new history state.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenterData] WARNING: +[MSTokenExchange removeAllCachedTokens]/206 Failed to remove all of the tokens from keychain\r\n[AppCenter] VERBOSE: -[MSDBStorage dropDatabase]/112 Database 
file:///Users/dave/Library/Developer/CoreSimulator/Devices/8A749D1A-DBD4-45B5-9161-2C222D36EB4A/data/Containers/Data/Application/4574E442-2C84-466B-A189-CCA1D1768C21/Library/Application%20Support/com.microsoft.appcenter/Documents.sqlite has been deleted.\r\n[AppCenter] VERBOSE: -[MSDBStorage createTable:columnsSchema:uniqueColumnsConstraint:]_block_invoke/136 Table appDocuments has been created\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenterAuth] ERROR: -[MSAuth acquireTokenInteractivelyWithKeyPathForCompletionHandler:]_block_invoke/494 User sign-in failed. Error: Error Domain=MSALErrorDomain Code=-50000 \"(null)\" UserInfo={MSALErrorDescriptionKey=Nil id_token passed, MSALInternalErrorCodeKey=-42600}\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.\r\n[AppCenter] INFO: -[MSChannelUnitDefault authTokenContext:didUpdateAuthToken:]_block_invoke/105 New auth token received, flushing queue.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Url to load test in cerberus queue ?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Some features are not working after install","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Unable to load spacy model linked by a path","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation: There is no grammar definition","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Add support for Fn::Transform\n\nhttps://docs.aws.amazon.com/AWSCloudFormation/latest/UserGuide/intrinsic-function-reference-transform.html","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[Question] Smooth scrolling, when clicking an anchor link","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"API's submissions pagination","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"\"How It Works\" link","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# CVE-2018-19826 (Medium) detected in node-sass-v4.11.0\n\n## CVE-2018-19826 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.11.0</b></p></summary>\n<p>\n\n<p>:rainbow: Node.js bindings to libsass</p>\n<p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (4)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/binding.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \n** DISPUTED ** In inspect.cpp in LibSass 3.5.5, a high memory footprint caused by an endless loop (containing a Sass::Inspect::operator()(Sass::String_Quoted*) stack frame) may cause a Denial of Service via crafted sass input files with stray '&' or '/' characters. NOTE: Upstream comments indicate this issue is closed as \"won't fix\" and \"works as intended\" by design.\n\n<p>Publish Date: 2018-12-03\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19826>CVE-2018-19826</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[CoreShop2] Front-End site exemple template in PHP.","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Familias2ped - `please contact MDV\n\nBelow, based on first example in the documentation of \r\n`Familias2ped`, I am adviced to contact MDV and I hereby comply: \r\n```` r\r\nlibrary(Familias, quietly = TRUE)\r\nlibrary(forrel)\r\n#> Loading required package: pedtools\r\ndata(NorwegianFrequencies)\r\nTH01 = NorwegianFrequencies$TH01\r\nlocus1 = FamiliasLocus(TH01)\r\npersons = c('mother', 'daughter', 'AF')\r\nped1 = FamiliasPedigree(id = persons,\r\n dadid = c(NA, 'AF', NA),\r\n momid = c(NA, 'mother', NA),\r\n sex = c('female', 'female', 'male'))\r\ndatamatrix = data.frame(THO1.1=c(NA, 8, NA), THO1.2=c(NA,9.3, NA))\r\nrownames(datamatrix) = persons\r\nx = Familias2ped(ped1, datamatrix, locus1)\r\n#> Warning in as.ped.data.frame(p, locus_annotations = annotations): Argument\r\n#> `locus_annotations` is deprecated; use `locusAttributes` instead\r\n#> Error: Genotype columns are sorted differently from `locusAttributes`. Please contact MDV\r\n```\r\n\r\n<sup>Created on 2019-08-11 by the [reprex package](https://reprex.tidyverse.org) (v0.3.0)</sup>\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Please add me - Grass Valley, CA","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Design Review 2017-08-23","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"bad output in sha1","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"DOC: add documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# SDR_Mechanics.esp\n\n<https://www.nexusmods.com/oblivion/mods/37385>\n\nnews as of 7/4/2018\nOfficial Series 9 release is finally available!\nPlease note that the current version 9.010 is (mostly) bug free and all the features should be working. There are still major performance issues if you have a heavy load order with lots of graphics and high population density in exterior locations. You can expect up to a 20 FPS drop. Interior locations don't seem to have any issues.\n\nI also want the performance to be better for myself, so as much as it saddens me, I have decided to do another pass in the future that will significantly cut back on features and the complexity of the revised detection formula. However, I don't have a lot of free time, so I don't have a timeline for when that will happen.\n\nIn the meantime, I hope you will give the current version a shot, and let me know what works and what doesn't. If there are specific features you don't want, need, or are relatively unimportant, let me know and I will consider your suggestions when I revisit this.\n\nThe website has been completely overhauled. And I *think* all the documentation is finally up-to-date. If you have any questions or find any errors let me know in the forums or via private message. Check it out here: SDR Website\n\nShadow Hide You...\n\n-saebel\n=====================================================\nSneaking Detection Recalibrated (SDR)\nVoted FILE OF THE MONTH on the Oblivion Nexus for April, 2012\nCategory: Gameplay Effects and Changes (Stealth/Detection/Magic/Combat)\n\nThis mod re-calibrates the system for detection, sneaking, and nearly everything related to it. And by \"re-calibrate\", I mean a complete page one rewrite of the whole freaking system\n\nFeatures includes:\n- A completely rewritten detection formula that replaces the default Oblivion formula, built into an efficient OBSE plug-in.\n- Immersive features, such as transparent sneaking NPCs, dynamic night-eye shaders, alternate detect life shaders, and new perks.\n- New NPC AI detection behavior changes including using detection magic (night-eye/detect life) and a custom version of SM Combat Hide built in.\n- New spells/effects for blindness, deafness, muffling.\n- Revamped sprinting / movement / fatigue options.\n- Alternative ways to \"skill-up\" when sneaking, including points for undetected clean kills.\n- Revised rules on how invisibility works when it comes to detection, torches, and light spells.\nAll of these features and much, much more...\n\nMajor benefits compared to other stealth overhauls:\n- SDR does not modify any skills, making it compatible with all leveling mods, including Oblivion XP.\n- Works out of the box with the default settings, but can be customized with various .ini files.\n- Extensively documented on SDR's Official Website, which you can also download locally as a single HTML document.\n\n===========================================================================================\nAbout Performance Improvements\n=========================================================================================== \nSeries 9 has been rewritten extensively to try and improve performance. However, in combination with mods that add lots of content, especially high population densities in exterior areas, there will be a very noticeable performance hit. 
More information about performance and how to improve it is available on the SDR website.\n\n===========================================================================================\nRequired / Recommended / Optional files\n=========================================================================================== \nPlease see the required files link above for a full list. Additional details, along with installation instructions, credits, development plans, contact/legal permissions, etc. are all on SDR's Official Website\ufeff.\n\nIMPORTANT:\nIf you are updating from any version prior to version 9.000, you should completely uninstall the old version, do a clean save without SDR loaded, and then install the new version of SDR. Otherwise bad things happen. Complete installation, uninstalling, and version change log information are available on the SDR website.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"load_latlngJSON documentation not listed on website","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Header partial transparancy not functioning correctly\n\nHeader partial transparancy not functioning correctly\r\n\r\nWhen creating a main template with the header and footer, creating a partially transparant (so the header has a background color like background-color: rgba(81, 79, 76, 0.5);) header works fine. Untill you add the \"Inner Content\" area and there is already content being pulled in from a page. Then the transprancy is gone.\r\n\r\nThis pulled in content from, let's say the homepage, needs to show a container with a background image with the following settings:\r\n\r\nBackground size: Cover\r\nBackground Attachment (Parralax): Fixed\r\nTop: -200px\r\n\r\nContainer padding: top 21% / bottom 12% (so it has some height)\r\nSection Container Width: page-width\r\n\r\n**Have you tried all the steps at https://oxygenbuilder.com/documentation/troubleshooting/troubleshooting-guide/?**\r\n*Yes*\r\n\r\n**Are you able to replicate the issue on a Sandbox install at https://oxygenbuilder.com/try?**\r\n*If yes, provide the link to a Sandbox install where the issue is present. If the issue only exists on a specific post or template, provide a direct link to that post or template.*\r\n\r\nhttp://affectionate-eel.w6.wpsandbox.pro/\r\n\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# App package env var default version\n\nCan we add a description for the env var which is available when using the default version of an application package (without \"#version\" postfix) \n\n---\n#### Document Details\n\n\u26a0 *Do not edit this section. It is required for docs.microsoft.com \u279f GitHub issue linking.*\n\n* ID: fe76a04d-fca3-b64a-9fdf-b12de26d3d19\n* Version Independent ID: 9dbb15f3-8665-595d-2322-fb8cdfc0c6e3\n* Content: [Task runtime environment variables - Azure Batch](https://docs.microsoft.com/en-us/azure/batch/batch-compute-node-environment-variables#feedback)\n* Content Source: [articles/batch/batch-compute-node-environment-variables.md](https://github.com/Microsoft/azure-docs/blob/master/articles/batch/batch-compute-node-environment-variables.md)\n* Service: **batch**\n* GitHub Login: @laurenhughes\n* Microsoft Alias: **lahugh**","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Peridot 2.0 Ideas\n\nAfter a while of thinking about what a peridot 2.0 should look like, I don't think i want to try and compete with Kahlan and PHPUnit. Peridot 1.0 is basically a lite version of kahlan with a smaller community and resources behind it. I don't really see any area that peridot can try to tap into that Kahlan userbase in a way that Kahlan isn't already doing.\r\n\r\nThe describe-it syntax is very popular in other languages and personally, I do like how expressive the system can be. That being said however, PHPUnit by far is the defacto testing framework and majority of the PHP community resources and integrations revolve around PHPUnit. If kahlan hasn't taken off by now, then it's very likely that the php community won't be ready to rally around a new framework.\r\n\r\nPersonally, I actually like that the PHP community largely pools our resources into one testing framework. It's very attractive for new comers and also means that the community can focus our efforts on growing the PHPUnit ecosystem instead of dilluting our efforts among competing frameworks.\r\n\r\nAll that said, I do think there might a space for a new frontend for PHPUnit. A library that piggy backs off the entire phpunit framework but provides a different interface for writing tests with the describe-it syntax that the peridot/kahlan community enjoys.\r\n\r\nI think peridot 2.0 would be a good place to explore this integration with phpunit. It'd allow us to focus on the parts of testing framework that are interesting and leave the heavy lifting to phpunit all the while piggy backing off of the wealth of documentation and third party plugins that phpunit provides.\r\n\r\nIn a similar manner, we'd probably make peridot/leo be just a thin wrapper around the PHPUnit assertions library and include in the peridot 2.0 core.\r\n\r\nIn the long run, I'd love to see peridot's new phpunit frontend be brought under the fold of PHPUnit with first class support and apart of the phpunit ecosystem instead of a separate library.\r\n\r\nLet me know what you all think about this idea, welcome to suggestions :)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-19839 (Medium) detected in CSS::Sass-v3.6.0\n\n## CVE-2018-19839 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>CSS::Sassv3.6.0</b></p></summary>\n<p>\n\n<p>Library home page: <a href=https://metacpan.org/pod/CSS::Sass>https://metacpan.org/pod/CSS::Sass</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (63)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/unchecked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/output.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/cencode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/base.h\n - /website/docs/node_modules/node-sass/src/libsass/src/json.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/position.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp\n - /website/docs/node_modules/node-sass/src/libsass/contrib/plugin.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/core.h\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/functions.h\n - /website/docs/node_modules/node-sass/src/libsass/test/test_superselector.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8_string.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/node.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cencode.c\n - /website/docs/node_modules/node-sass/src/libsass/src/subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/base64vlq.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/c99func.c\n - /website/docs/node_modules/node-sass/src/libsass/src/position.cpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/values.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_subset_map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass2scss.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/paths.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass/context.h\n - /website/docs/node_modules/node-sass/src/libsass/src/color_maps.hpp\n - /website/docs/node_modules/node-sass/src/libsass/test/test_unification.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_util.cpp\n - /website/docs/node_modules/node-sass/src/libsass/script/test-leaks.pl\n - /website/docs/node_modules/node-sass/src/libsass/src/source_map.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/lexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/json.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/units.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/b64/encode.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/environment.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/utf8/checked.h\n - /website/docs/node_modules/node-sass/src/libsass/src/plugins.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/listize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debug.hpp\n - /website/docs/node_modules/node-sass/src/libsass/include/sass2scss.h\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nIn LibSass prior to 3.5.5, the function handle_error in sass_context.cpp allows attackers to cause a denial-of-service resulting from a heap-based buffer over-read via a crafted sass file.\n\n<p>Publish Date: 2018-12-04\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19839>CVE-2018-19839</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19839\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19839</a></p>\n\n<p>Fix Resolution: 3.5.5</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with 
WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# vault-guides/identity/oidc-auth: What's an JWT token and how to get it?\n\nHey guys,\r\n\r\nRegarding: https://github.com/hashicorp/vault-guides/tree/master/identity/oidc-auth\r\n\r\nThe guide lacks to explain how and where I should get a JWT token from. Also after trying to replicate the situation in this guide, I'm stuck on the error message: \"Authentication failed: role with oidc role_type is not allowed\" when trying to login.\r\n\r\nWhat am I missing? This might also be assumed basic knowledge, but it would be helpful if the guide (also at vault's own documentation) would point to that kind of basic knowledge.\r\n\r\nCan't wait to hear from you! :-)\r\n\r\n_Edit: My goal is to translate this knowledge to Gitlab OpenID authentication but it's really a lot to get my head around._","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Documentation is W\n\nhttps://github.com/structurizr/dotnet/issues","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Task: DotNetCoreCLI@2 nuget pack unexpectedly issuing multiple commands\n\n## Required Information\r\n**Question, Bug, or Feature?** \r\n*Type*: Bug\r\n\r\n**Enter Task Name**: DotNetCoreCli@2\r\n\r\nlist here (V# not needed): \r\nhttps://github.com/microsoft/azure-pipelines-tasks/tree/master/Tasks/DotNetCoreCLIV2\r\n\r\n## Environment\r\n- Server - Azure Pipelines or TFS on-premises? Azure Pipelines\r\n \r\n - If using Azure Pipelines, provide the account name, team project name, build definition name/build number: \r\n\r\n> Account name: innovian\r\n> Team project name: Platform\r\n> Build definition name: Utilities.ResourceNameBuilder\r\n> Build number: 20190810.5\r\n\r\n\r\n- Agent - Hosted or Private: Hosted\r\n\r\nAgent queue name: Hosted Agent\r\n\r\n\r\n## Issue Description\r\nTask display name: \"Nuget package pack\"\r\n\r\nHere is the YAML I'm using:\r\n```\r\n- task: DotNetCoreCLI@2\r\n displayName: \"Nuget package pack\"\r\n inputs:\r\n command: 'pack'\r\n packagesToPack: '**/Utilities.ResourceNameBuilder.csproj'\r\n includesymbols: true\r\n versioningScheme: 'byPrereleaseNumber'\r\n majorVersion: '1'\r\n minorVersion: '0'\r\n patchVersion: '0'\r\n\r\n- task: DotNetCoreCLI@2\r\n displayName: 'Nuget package push'\r\n inputs:\r\n command: 'push'\r\n packagesToPush: '$(Build.ArtifactStagingDirectory)/*.nupkg'\r\n nuGetFeedType: 'internal'\r\n publishVstsFeed: 'xyz'\r\n```\r\n\r\nI'm aiming to include the symbols and only publish the one project (specifically stated since I also have a unit test project in the solution).\r\n\r\nI'm showing that there are two pushes performed: one for the packed target and one for the symbols. If I run this via the NuGetCommand@2, my YAML looks like the following:\r\n\r\n```\r\n- task: NuGetCommand@2\r\n inputs:\r\n command: 'pack'\r\n packagesToPack: '**/Utilities.ResourceNameBuilder.csproj'\r\n versioningScheme: 'byPrereleaseNumber'\r\n majorVersion: '1'\r\n minorVersion: '0'\r\n patchVersion: '0'\r\n includeSymbols: true\r\n```\r\n\r\nAnd it includes only the single push as I'd expect and runs without an error. 
Running as-is with the .NET Core CLI yields an error as seen in the logs below.\r\n\r\n### Task logs\r\n\r\n> [section]Starting: Nuget package push\r\n> ==============================================================================\r\n> Task : .NET Core\r\n> Description : Build, test, package, or publish a dotnet application, or run a custom dotnet command\r\n> Version : 2.155.0\r\n> Author : Microsoft Corporation\r\n> Help : https://docs.microsoft.com/azure/devops/pipelines/tasks/build/dotnet-core-cli\r\n> ==============================================================================\r\n> [command]C:\\windows\\system32\\chcp.com 65001\r\n> Active code page: 65001\r\n> SYSTEMVSSCONNECTION exists true\r\n> SYSTEMVSSCONNECTION exists true\r\n> SYSTEMVSSCONNECTION exists true\r\n> Saving NuGet.config to a temporary config file.\r\n> Saving NuGet.config to a temporary config file.\r\n> [command]\"C:\\Program Files\\dotnet\\dotnet.exe\" nuget push d:\\a\\1\\a\\Utilities.ResourceNameBuilder.1.0.0-CI-20190810-234741.nupkg --source https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v3/index.json --api-key VSTS\r\n> info : Pushing Utilities.ResourceNameBuilder.1.0.0-CI-20190810-234741.nupkg to 'https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/'...\r\n> info : PUT https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/\r\n> info : Accepted https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/ 1049ms\r\n> info : Your package was pushed.\r\n> [command]\"C:\\Program Files\\dotnet\\dotnet.exe\" nuget push d:\\a\\1\\a\\Utilities.ResourceNameBuilder.1.0.0-CI-20190810-234741.symbols.nupkg --source https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v3/index.json --api-key VSTS\r\n> info : Pushing Utilities.ResourceNameBuilder.1.0.0-CI-20190810-234741.symbols.nupkg to 'https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/'...\r\n> info : PUT https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/\r\n> info : Conflict https://pkgs.dev.azure.com/innovian/_packaging/xyz/nuget/v2/ 219ms\r\n> error: Response status code does not indicate success: 409 (Conflict - The feed already contains 'Utilities.ResourceNameBuilder 1.0.0-CI-20190810-234741'. (DevOps Activity ID: A5D45BB8-133C-4368-B2B8-B7B68D04DF0E)).\r\n> [error]Error: The process 'C:\\Program Files\\dotnet\\dotnet.exe' failed with exit code 1\r\n> [error]Packages failed to publish\r\n> [section]Finishing: Nuget package push\r\n\r\nRemoved hashes at line starts to keep them from being interpreted as Github markup.\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Add documentation explaining how this template project works\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"cmd/compile: bad position information for inlined functions","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Custom fonts error\n\n#Description\r\n\r\nwhen trying to add custom font like the documentation says [here](https://callstack.github.io/react-native-paper/fonts.html) \r\n\r\ni get this error\r\n\r\n```\r\nWarning: Failed prop type: Invalid props.style key `0` supplied to `Text`.\r\nBad object: {\r\n \"0\": \"R\",\r\n \"1\": \"o\",\r\n \"2\": \"b\",\r\n \"3\": \"o\",\r\n \"4\": \"t\",\r\n \"5\": \"o\",\r\n \"color\": \"white\",\r\n \"textAlign\": \"left\",\r\n \"writingDirection\": \"rtl\",\r\n \"fontSize\": 24,\r\n \"lineHeight\": 32,\r\n \"marginVertical\": 2,\r\n \"letterSpacing\": 0,\r\n \"paddingVertical\": 5\r\n}\r\n```\r\n\r\n## Reproducible Demo\r\n\r\nmy code as the following \r\n\r\n```\r\nconst themeDefault = {\r\n ...DefaultTheme,\r\n colors: {\r\n ...DefaultTheme.colors,\r\n primary: '#FFFFFF',\r\n accent: '#F1F1F1',\r\n },\r\n fonts: {\r\n regular: 'Roboto',\r\n medium: 'Roboto',\r\n light: 'Roboto Light',\r\n thin: 'Roboto Thin',\r\n },\r\n};\r\n\r\nconst themeDark = {\r\n ...DarkTheme,\r\n fonts: {\r\n regular: 'Roboto',\r\n medium: 'Roboto',\r\n light: 'Roboto Light',\r\n thin: 'Roboto Thin',\r\n },\r\n}\r\n```\r\n\r\n\r\n## Environment\r\n```\r\n\"react\": \"16.8.3\",\r\n\"expo\": \"^33.0.5\",\r\n\"react-native\": \"https://github.com/expo/react-native/archive/sdk-33.0.0.tar.gz\",\r\n \"@babel/plugin-proposal-class-properties\": \"^7.3.0\",\r\n \"@babel/plugin-proposal-object-rest-spread\": \"^7.3.1\",\r\n \"@babel/preset-env\": \"^7.3.1\",\r\n \"@babel/preset-flow\": \"^7.0.0\",\r\n \"@babel/preset-react\": \"^7.0.0\",\r\n \"@babel/preset-typescript\": \"^7.3.3\",\r\n \"@types/react-navigation\": \"^2.13.10\",\r\n \"babel-loader\": \"^8.0.5\",\r\n \"babel-plugin-module-resolver\": \"^3.1.1\",\r\n \"babel-preset-expo\": \"^5.0.0\",\r\n \"css-loader\": \"^3.0.0\",\r\n \"expo-cli\": \"^2.3.8\",\r\n \"style-loader\": \"^0.23.1\",\r\n \"webpack\": \"^4.29.0\",\r\n \"webpack-cli\": \"^3.2.1\",\r\n \"webpack-dev-server\": \"^3.1.14\"\r\n\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# getMany returns single result after adding foreign key column for find methods\n\n**Issue type:**\r\n\r\n[ ] question\r\n[X] bug report\r\n[ ] feature request\r\n[ ] documentation issue\r\n\r\n**Database system/driver:**\r\n\r\n[ ] `cordova`\r\n[ ] `mongodb`\r\n[ ] `mssql`\r\n[ ] `mysql` / `mariadb`\r\n[ ] `oracle`\r\n[X] `postgres`\r\n[ ] `cockroachdb`\r\n[ ] `sqlite`\r\n[ ] `sqljs`\r\n[ ] `react-native`\r\n[ ] `expo`\r\n\r\n**TypeORM version:**\r\n\r\n[X] `latest`\r\n[ ] `@next`\r\n[ ] `0.x.x` (or put your version here)\r\n\r\n**Steps to reproduce or a small repository showing the problem:**\r\n\r\nI first had an issue trying to query something using `find` and not having the full foreign entity but only its id, so as explained https://github.com/typeorm/typeorm/issues/3288 and https://github.com/typeorm/typeorm/issues/2163 I've added a column to be able to use find with the foreign key\r\n\r\n```ts\r\n @Index('userorganization-user-idx')\r\n @ManyToOne(type => User, user => user.userOrganizations, { primary: true, eager: true })\r\n user: User;\r\n\r\n @Index('userorganization-organization-idx')\r\n @ManyToOne(type => Organization, organization => organization.userOrganizations, { primary: true, eager: true })\r\n organization: Organization;\r\n\r\n @Column()\r\n organizationCustomerNumber: string;\r\n```\r\n\r\nas you can see I've added the `organizationCustomerNumber` column matching the one in the second `ManyToOne`.\r\n\r\nThe problem is that now this query:\r\n\r\n```ts\r\nconst userOrganizations = await this.userOrganizationRepository\r\n .createQueryBuilder('userOrganization')\r\n .innerJoinAndSelect('userOrganization.organization', 'organization')\r\n .innerJoinAndSelect('userOrganization.role', 'role')\r\n .innerJoin('userOrganization.user', 'user')\r\n .where('user.id = :id', { id: user.id })\r\n .getMany();\r\n```\r\n\r\nreturns only the first row, removing the foreign key column fixes the problem","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# typescriptlang docs/home.html doesn't work\n\nIn China, The docs/home.html 404.\r\n\r\n\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"headless service with no port definition finds no endpoint","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2019-6283 (Medium) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2019-6283 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nIn LibSass 3.5.5, a heap-based buffer over-read exists in Sass::Prelexer::parenthese_scope in prelexer.hpp.\n\n<p>Publish Date: 2019-01-14\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6283>CVE-2019-6283</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-6284</a></p>\n<p>Release Date: 2019-08-06</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update Firewall Topics","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"RFE: Develop an example that integrates the search box","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# DGLGraph.edge_ids return invalid number edges?\n\n## \ud83d\udc1b Bug\r\n\r\nThis is a very specific scenario, where the edge_ids seem to be returning an invalid tuple. \r\n\r\n## To Reproduce\r\n\r\nI have a graph with 75377 nodes and 285854 edges. When I am examining a specific node, these are the values I get,\r\n\r\n```python\r\n>>> graph.in_degree(60320) \r\n1\r\n\r\n>>> graph.out_degree(60320)\r\n1\r\n\r\n>>> graph.in_edges(60320)\r\n(tensor([15610]), tensor([60320]))\r\n\r\n>>> graph.out_edges(60320)\r\n(tensor([60320]), tensor([15610]))\r\n\r\n>>> graph.edge_ids(60320, 15610)\r\n(tensor([60320]), tensor([15610]), tensor([99007]))\r\n```\r\n\r\n## Expected behavior\r\n\r\nFrom the first 4 commands, and based on the documentation on [edge_ids](https://docs.dgl.ai/generated/dgl.DGLGraph.edge_ids.html#dgl.DGLGraph.edge_ids), I expect `edge_ids(60320, 15610)` to return something like this,\r\n\r\n```python\r\n(tensor([60320, 15610]), tensor([15610, 60320]), tensor([99007, different_edge_id]))\r\n```\r\n\r\n## Environment\r\n\r\n - DGL Version (e.g., 1.0): **0.3**\r\n - Backend Library & Version (e.g., PyTorch 0.4.1, MXNet/Gluon 1.3): **PyTorch 1.1.0**\r\n - OS (e.g., Linux): **MacOS**\r\n - How you installed DGL (`conda`, `pip`, source): **conda**\r\n - Build command you used (if compiling from source): **N/A**\r\n - Python version: **3.7**\r\n - CUDA/cuDNN version (if applicable): **N/A**\r\n - GPU models and configuration (e.g. V100): **N/A**\r\n - Any other relevant information: **N/A**\r\n\r\n## Additional context\r\n\r\n**N/A**","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation should link to design spec","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# [firebase_crashlytics] Gradle build failed\n\nI am trying to integrate firebase crashlytics into my Android project following the [docs](https://pub.dev/packages/firebase_crashlytics#-readme-tab-) and end up with gradle build falilure:\r\n```\r\nFAILURE: Build failed with an exception.\r\n\r\n* Where:\r\nBuild file 'F:\\Flutter\\Projects\\Alola\\android\\app\\build.gradle' line: 24\r\n\r\n* What went wrong:\r\nA problem occurred evaluating project ':app'.\r\n> ASCII\r\n```\r\n\r\nAnd after some struggling I was able to find the cause for it is because of the google service's version specified in the doc . After I downgrade google service version from `4.3.0` to `4.2.0` then the error is gone.\r\n\r\nMy Flutter doctor output:\r\n```[\u221a] Flutter (Channel beta, v1.7.8+hotfix.4, on Microsoft Windows [Version 10.0.17134.885], locale vi-VN)\r\n[\u221a] Android toolchain - develop for Android devices (Android SDK version 29.0.2)\r\n[\u221a] Android Studio (version 3.4)\r\n[!] Android Studio (version 3.2)\r\n X Flutter plugin not installed; this adds Flutter specific functionality.\r\n X Dart plugin not installed; this adds Dart specific functionality.\r\n[!] IntelliJ IDEA Community Edition (version 2018.2)\r\n X Flutter plugin not installed; this adds Flutter specific functionality.\r\n X Dart plugin not installed; this adds Dart specific functionality.\r\n[\u221a] VS Code, 64-bit edition (version 1.31.1)\r\n[\u221a] Connected device (2 available)\r\n\r\n! Doctor found issues in 2 categories.\r\n\r\nF:\\Flutter\\Projects\\GameDB>flutter doctor -v\r\n[\u221a] Flutter (Channel beta, v1.7.8+hotfix.4, on Microsoft Windows [Version 10.0.17134.885], locale vi-VN)\r\n \u2022 Flutter version 1.7.8+hotfix.4 at F:\\Flutter\\flutter_windows_v0.7.3-beta\\flutter\r\n \u2022 Framework revision 20e59316b8 (3 weeks ago), 2019-07-18 20:04:33 -0700\r\n \u2022 Engine revision fee001c93f\r\n \u2022 Dart version 2.4.0\r\n\r\n[\u221a] Android toolchain - develop for Android devices (Android SDK version 29.0.2)\r\n \u2022 Android SDK at F:\\AndroidSDK\r\n \u2022 Android NDK location not configured (optional; useful for native profiling support)\r\n \u2022 Platform android-29, build-tools 29.0.2\r\n \u2022 ANDROID_HOME = F:\\AndroidSDK;F:\\AndroidSDK\\platform-tools\\;\r\n \u2022 Java binary at: F:\\AndroidStudio\\jre\\bin\\java\r\n \u2022 Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1343-b01)\r\n \u2022 All Android licenses accepted.\r\n\r\n[\u221a] Android Studio (version 3.4)\r\n \u2022 Android Studio at F:\\AndroidStudio\r\n \u2022 Flutter plugin version 38.2.1\r\n \u2022 Dart plugin version 183.6270\r\n \u2022 Java version OpenJDK Runtime Environment (build 1.8.0_152-release-1343-b01)\r\n\r\n[\u221a] VS Code, 64-bit edition (version 1.31.1)\r\n \u2022 VS Code at C:\\Program Files\\Microsoft VS Code\r\n \u2022 Flutter extension version 2.23.0\r\n\r\n[\u221a] Connected device (2 available)\r\n \u2022 D6603 \u2022 BH902VHB1L \u2022 android-arm \u2022 Android 6.0.1 (API 23)\r\n \u2022 Android SDK built for x86 64 \u2022 emulator-5554 \u2022 android-x64 \u2022 Android 8.0.0 (API 26) (emulator)\r\n```","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# REDISCOVER should attempt to make airlocks safe\n\nDocs for the `rediscover` command claim that affected airlocks will be placed into their safest mode once discovery is done. Currently, modes are not even implemented, so obviously it does not.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update kubeconfig docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Review assignments instruction","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Correct documentation on hybrid/branch run (section 8.3) and augment create_clone","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# CVE-2018-20822 (Medium) detected in opennms-opennms-source-23.0.0-1\n\n## CVE-2018-20822 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-23.0.0-1</b></p></summary>\n<p>\n\n<p>A Java based fault and performance management system</p>\n<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n</p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Library Source Files (62)</summary>\n<p></p>\n<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>\n<p>\n\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/expand.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/factory.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/value.h\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.hpp\n - /website/docs/node_modules/node-sass/src/callback_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/file.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operation.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/operators.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.hpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/parser.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/constants.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/util.cpp\n - /website/docs/node_modules/node-sass/src/custom_function_bridge.cpp\n - /website/docs/node_modules/node-sass/src/custom_importer_bridge.h\n - /website/docs/node_modules/node-sass/src/libsass/src/bind.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/backtrace.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/extend.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.h\n - /website/docs/node_modules/node-sass/src/sass_types/sass_value_wrapper.h\n - /website/docs/node_modules/node-sass/src/libsass/src/error_handling.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/debugger.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/emitter.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/number.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.h\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_values.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.hpp\n - 
/website/docs/node_modules/node-sass/src/libsass/src/output.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/null.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/functions.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/cssize.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_c.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/inspect.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/color.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/values.cpp\n - /website/docs/node_modules/node-sass/src/sass_context_wrapper.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/list.h\n - /website/docs/node_modules/node-sass/src/libsass/src/check_nesting.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/map.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/to_value.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.cpp\n - /website/docs/node_modules/node-sass/src/sass_types/string.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/sass_context.cpp\n - /website/docs/node_modules/node-sass/src/libsass/src/prelexer.hpp\n - /website/docs/node_modules/node-sass/src/libsass/src/context.hpp\n - /website/docs/node_modules/node-sass/src/sass_types/boolean.h\n - /website/docs/node_modules/node-sass/src/libsass/src/eval.cpp\n</p>\n</details>\n<p></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nLibSass 3.5.4 allows attackers to cause a denial-of-service (uncontrolled recursion in Sass::Complex_Selector::perform in ast.hpp and Sass::Inspect::operator in inspect.cpp).\n\n<p>Publish Date: 2019-04-23\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20822>CVE-2018-20822</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: None\n - User Interaction: Required\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: None\n - Availability Impact: High\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20822\">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20822</a></p>\n<p>Release Date: 2019-08-06</p>\n<p>Fix Resolution: 3.6.0</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Intellisense is trying to open file with duplicated path\n\n**Type: LanguageService**\r\n<!----- Input information below ----->\r\n\r\n<!--\r\n**Please review existing issues and our documentation at https://github.com/Microsoft/vscode-cpptools/tree/master/Documentation prior to filing an issue.**\r\n-->\r\n\r\n**Describe the bug**\r\n- OS and Version: Windows 10\r\n- VS Code Version: July 2019 1.37.0\r\n- C/C++ Extension Version: 0.25.0-insiders\r\n- Other extensions you installed (and if the issue persists after disabling them): Green theme 0.0.2, Python(Made by MS) 2019.8.29288 - I disabled all when I encountered bug.\r\n- A clear and concise description of what the bug is: VSCode is opening file in wrong path(duplicated path. VScode is trying to open `A/A` when actual file path is `A`) when I click compilation errors on \"Problems tab\".\r\n\r\n\r\n**To Reproduce**\r\n<!-- Steps to reproduce the behavior: -->\r\n<!-- *The most actionable issue reports include a code sample including configuration files such as c_cpp_properties.json* -->\r\n1. Install G++ 8.2.0, VSCode and C/C++ extension.\r\n2. Make any folder and open with VSCode.\r\n3. This is tasks.json.\r\n```\r\n{\r\n // See https://go.microsoft.com/fwlink/?LinkId=733558 \r\n // for the documentation about the tasks.json format\r\n \"version\": \"2.0.0\",\r\n \"echoCommand\": true,\r\n \"tasks\": [\r\n {\r\n \"type\": \"shell\",\r\n \"label\": \"BUILD\",\r\n \"command\": \"C:\\\\MinGW\\\\bin\\\\g++.exe\",\r\n \"args\": [\r\n \"-g\",\r\n \"${file}\",\r\n \"-o\",\r\n \"${fileDirname}\\\\${fileBasenameNoExtension}.exe\"\r\n ],\r\n /*\"options\": {\r\n \"cwd\": \"C:\\\\MinGW\\\\bin\"\r\n },*/\r\n \"problemMatcher\": [\r\n \"$gcc\"\r\n ],\r\n \"group\": \"build\"\r\n }\r\n ]\r\n}\r\n```\r\n4. This is c_cpp_properties.json.\r\n```\r\n{\r\n \"configurations\": [\r\n {\r\n \"name\": \"CompetitiveProgramming\",\r\n \"includePath\": [\r\n \"${workspaceFolder}/**\"\r\n ],\r\n \"defines\": [\r\n \"_DEBUG\",\r\n \"UNICODE\",\r\n \"_UNICODE\",\r\n \"__McDic__\"\r\n ],\r\n \"compilerPath\": \"C:\\\\MinGW\\\\bin\\\\g++.exe\",\r\n \"intelliSenseMode\": \"gcc-x64\",\r\n \"cStandard\": \"c11\",\r\n \"cppStandard\": \"c++17\"\r\n }\r\n ],\r\n \"version\": 4\r\n}\r\n```\r\n5. This is my cpp code, which will make compilation error. (**Line 10's std::cou1t**)\r\n```\r\n#include <stdio.h>\r\n#include <iostream>\r\n#include <vector>\r\n#include <time.h>\r\n\r\nint main(void){\r\n#ifdef __McDic__\r\n printf(\"McDic is defined.\\n\");\r\n#endif\r\n std::cou1t << \"Hello World \" << time(NULL) << std::endl;\r\n return 0;\r\n}\r\n```\r\n6. Build with Ctrl+Shift+B.\r\n\r\n**Expected behavior**\r\nYou will encounter compilation error. If you click error message in \"PROBLEMS\" tab, you will see VScode is trying to open file with wrong path. For example,\r\n```\r\nUnable to open 'testcpp.cpp': Unable to read file (Error: File not found (c:\\Users\\spong\\Desktop\\vscode_new_testing_workspace\\c:\\Users\\spong\\Desktop\\vscode_new_testing_workspace\\testcpp.cpp)).\r\n```\r\n\r\n**Screenshots**\r\nThis is the messages when I try to click compilation error message to navigate to error line.\r\n\r\n\r\n\r\n\r\n\r\n**Additional context**\r\n<!--\r\n* Call Stacks: For bugs like crashes, deadlocks, infinite loops, etc. that we are not able to repro and for which the call stack may be useful, please attach a debugger and/or create a dmp and provide the call stacks. 
Windows binaries have symbols available in VS Code by setting your \"symbolSearchPath\" to \"https://msdl.microsoft.com/download/symbols\".\r\n* Add any other context about the problem here including log messages in your Output window (\"C_Cpp.loggingLevel\": \"Debug\" in settings.json).\r\n-->\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Follow Link Scss doesn't work with ~Tilde to search in node module or webpack alias\n\nUsing a **~**Tilde to go directly to a node_module path without having to add aliases to webpack this path doesn't get resolved by VS Code. \r\n\r\nSo with a node_module 'bootstrap' installed, we can import bootstrap sass-files using this construction:\r\n\r\n``` scss\r\n@import '~bootstrap/scss/bootstrap-grid';\r\n@import '~bootstrap/scss/utilities/sizing';\r\n```\r\nWhile this is working fine with Webpack and the sass-loader/node_sass, VS Code doesn't recognise this tilde-construction and searches on a follow-link in our `source-folder\\bootstrap\\` instead of in `node_modules\\bootstrap\\`\r\n\r\nWhen we use aliases in Javascript we solved his resolve problem by adding aliases to the jsconfig.json. But AFAIK there's not such a config file for Sass in VS Code.\r\nI've been searching everywhere, including settings on sass, but couldn't find anything where we could make VS Code respect the node_modules path (and Webpack aliases) with a tilde~.\r\n\r\nSo unfortunately we seem to not being able to click to follow link on these sass imports.\r\n\r\n**versions**\r\n- VSCode Version: 1.37.0\r\n- OS Version: Windows 10 Home Updated\r\n\r\n**Steps to Reproduce:**\r\n1. Have a node modulle, like node_modules/bootstrap, installed\r\n2. Import a sassfile from this module, like `@import '~bootstrap/scss/bootstrap-grid';`\r\n3. Ctrl+click on the import line to try to 'Follow Link'\r\nResult: it shows an error-message popup saying 'Unable to open' with the wrong path\r\n\r\n**extensions**\r\nDoes this issue occur when all extensions are disabled?: Yes\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Multiple ShellComponent instance","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Error when using ibis to insert pandas dataframe","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Script/Tool to update package list doc","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Upgrade to 3.1.0 ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Update --cert-path to --tls-cert-path","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"How to use databaseChangeEvents?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Cookbook updates should trigger file revision creation\n\nSo what I'm trying to do is combine your cookbook app with the group folders app to create a central cookbook that everyone on the nextcloud server can contribute to. Unfortunately if I give everyone write but deny delete there's still the possibility of vandalizing or accidentally messing up a recipe. What would mitigate this is if the file revisions were triggered when changes were made to the cookbook entry through the app. This doesn't seem to be the case right now as I made a change, adding a new instruction to an entry and then went and inspected the .json file for it and observed that it had no revision history, even though a change was made.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Release 0.1.0","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"502 bad gateway on web Server ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Sample course: first session: use consistent phrasing","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# [Request] Publish the Nextflow GroovyDoc\n\nHi Nextflow team,\r\n\r\nI'm experimenting with embedding nextflow in a java application and I tried finding the API documentation ( rather then ther user documentation) for the project. I have searched on the main website and tried other resources but I can't find it. Please also consider publishing the API doc for this wonderful tool, Nextflow.\r\n\r\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Not sure how to use the JBossPolicyRegistrationObjectFactory","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# CVE-2018-3721 (Medium) detected in lodash-1.0.2.tgz\n\n## CVE-2018-3721 - Medium Severity Vulnerability\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-1.0.2.tgz</b></p></summary>\n\n<p>A utility library delivering consistency, customization, performance, and extras.</p>\n<p>Library home page: <a href=\"https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz\">https://registry.npmjs.org/lodash/-/lodash-1.0.2.tgz</a></p>\n<p>Path to dependency file: /website/docs/package.json</p>\n<p>Path to vulnerable library: /tmp/git/website/docs/node_modules/lodash/package.json</p>\n<p>\n\nDependency Hierarchy:\n - gulp-3.9.1.tgz (Root Library)\n - vinyl-fs-0.3.14.tgz\n - glob-watcher-0.0.6.tgz\n - gaze-0.5.2.tgz\n - globule-0.1.0.tgz\n - :x: **lodash-1.0.2.tgz** (Vulnerable Library)\n<p>Found in HEAD commit: <a href=\"https://github.com/mixcore/website/commit/eeefb98d520629c182c4d88691216d2bd738678a\">eeefb98d520629c182c4d88691216d2bd738678a</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>\n<p> \n \nlodash node module before 4.17.5 suffers from a Modification of Assumed-Immutable Data (MAID) vulnerability via defaultsDeep, merge, and mergeWith functions, which allows a malicious user to modify the prototype of \"Object\" via __proto__, causing the addition or modification of an existing property that will exist on all objects.\n\n<p>Publish Date: 2018-06-07\n<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-3721>CVE-2018-3721</a></p>\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>\n<p>\n\nBase Score Metrics:\n- Exploitability Metrics:\n - Attack Vector: Network\n - Attack Complexity: Low\n - Privileges Required: Low\n - User Interaction: None\n - Scope: Unchanged\n- Impact Metrics:\n - Confidentiality Impact: None\n - Integrity Impact: High\n - Availability Impact: None\n</p>\nFor more information on CVSS3 Scores, click <a href=\"https://www.first.org/cvss/calculator/3.0\">here</a>.\n</p>\n</details>\n<p></p>\n<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>\n<p>\n\n<p>Type: Upgrade version</p>\n<p>Origin: <a href=\"https://nvd.nist.gov/vuln/detail/CVE-2018-3721\">https://nvd.nist.gov/vuln/detail/CVE-2018-3721</a></p>\n<p>Release Date: 2018-06-07</p>\n<p>Fix Resolution: 4.17.5</p>\n\n</p>\n</details>\n<p></p>\n\n***\nStep up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Improve readme","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# 405 Method Not Allowed: When trying to run/save lua script\n\nWhen trying to run a lua script I get an error 405 Method Not Allowed.\r\n\r\nI translated your website and I saw that ou mentionned the issue. I did try to use the unlock function but it did not work. I also tried disconnecting and reconnecting the card.\r\n\r\nI have a W-04 flashair.\r\n\r\nI did not see any documentation of support for a direct PUT method, maybe it would be more robust to use the upload.cgi. If you are open to that I might submit a pull request for this to update the worker.js to rather use upload.cgi commands.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Errors using LDAP auth backend","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Please add me - Oslo, Norway","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Installation steps in Fedora","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Update minishift install instructions with template callouts","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add instructions or a script for re-generating the list of functions in Fuchsia's core JIT snapshot","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Intercom initialization","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Requiring files for coverage not working properly","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Documentation out of date","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Windows Kernelspace proxier support","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Source for the Plant Monitoring System Notification thro Respeaker","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Page 14 is missing\n\nPage 14 is missing from OpenCore/Docs/Differences.pdf","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Peak Orientation","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"/usr/local/src/a2billing/vendor does not exist","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Add a README.md file for the Benchmark folder","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Provide deployment instructions for deploying to prod kubernetes","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Add Selenium Testing to main repository","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Program documentation. Not inspired by jdocs.. more like Google's python style guide.\n\n","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"As a user, I want to have simple instructions for getting your google calendar loaded up","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Database Documentation (II)","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Importing User Certificate in new iOS Cisco AnyConnect App","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# \"Cannot list available dashboards\" error occurred after updating to lates dashboard version\n\n**Description:**\r\n``Cannot list available dashboards`` error was shown after login into the portal after updating the APIM-Analytitcs pack to newest released carbon-dashboard version.\r\n\r\nIn the console log followig part was printed. \r\n\r\n```\r\n[2019-08-10 23:33:07,781] WARN {org.wso2.carbon.kernel.internal.startupresolver.StartupComponentManager} - You are trying to add an available capability org.wso2.carbon.uiserver.spi.RestApiProvider from bundle(org.wso2.carbon.dashboards.api:4.0.68) to an already satisfied startup listener component carbon-ui-server-startup-listener in bundle(org.wso2.carbon.uiserver:1.0.2). Either specify the capability in the Carbon-Component manifest header or explicitly skip the Startup Order Resolver. Refer the Startup Order Resolver documentation for more information.\r\n```\r\n\r\n**Suggested Labels:**\r\nPriority/High, Type/Bug, Severity/Major\r\n\r\n**Affected Product Version:**\r\ncarbon-dashboards 4.0.68\r\n\r\n**OS, DB, other environment details and versions:** \r\nMac OS Mojave 10.14.6\r\n\r\n**Steps to reproduce:**\r\nUpdate the carbon dashboard version of the APIM-Analytics pack to carbon-dashboards 4.0.68 and run the dashboard.sh ","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Build failing on OS X","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Problem: no man page for zmq_msg[_set]_group APis","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Teach/Inform new officers regarding scheduling events\n\nReview all the required documentation, websites, authorizations, restrictions, and budgets regarding setting up events for students to attend. Events can range from movie nights to detailed employer events or even tours of facilities.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Disconnected from the server after loading the example data","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Man page fails to explain what editorconfig is or does","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Hosted docs\n\n## \ud83d\udcda Documentation\r\n\r\nI think now would be a good time for us to start hosting our API docs on a public site. I view this as complimentary to, but connected to our current [tutorials page](https://napari.github.io/napari-tutorials/)\r\n\r\nUsing readthedocs is fine with me, I don't know if CZI will pay to remove adds, but that would be reasonable. Alternatively there is https://github.com/freeman-lab/minidocs made by our very own @freeman-lab or direct integration with our tutorials page.\r\n\r\nThoughts @kne42 @AhmetCanSolak @jni? ","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Using multiple nodes (Scaling phpsocket.io on multiple servers and processes)","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Still reproduced in IE11 after update to version 12.0.1: unable to find bean reference csvCreator while initialising #1778","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Autolink member properties and methods in class documentation","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Unable to discover proxmox disks\n\nThis issue captures the slack conversations w.r.t inability of NDM to discover certain disks.\r\n\r\n:question: Hello all,\r\nI continue my journey to understand openebs storage.\r\nReading the documentation regarding the cStor I added 8 * 32G disks to each nodes in my cluster. Actually there's 3 nodes in my cluster so I assume that I will seen 8 * 3 = 24 blockdevices in my clusters. But there's only 8:\r\n\r\n```bash\r\n$ kubectl get disk\r\nNAME SIZE STATE AGE\r\ndisk-10cbf5d6bd8330920fcb3bdda1aa6a55 34359738368 Active 35m\r\ndisk-11ebb418e80395b0378eca084e2a17e4 34359738368 Active 35m\r\ndisk-169068a2a396a22759d5bfa04f4a1b04 34359738368 Active 18m\r\ndisk-51e5c068c21f5186044957960dfbaefb 34359738368 Active 36m\r\ndisk-9d0e2efbf255eec778874a9fd42f1a72 34359738368 Active 31m\r\ndisk-c2b57cce00020b1d1562b9c469a949fa 34359738368 Active 31m\r\ndisk-dae3cec149cfd5fc5c6826caa0650e21 34359738368 Active 31m\r\ndisk-db1254ebd777a99e6b9b5626358c7038 34359738368 Active 17m\r\n```\r\n\r\nIt seems the 8 disks are dispatched between two nodes k8snodv01p and k8snodv03p:\r\n```bash\r\n$ kubectl describe disk | grep -A3 \"Name: disk-\"|grep \"Labels: kubernetes.io/hostname=\"|sort\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\n```\r\n\r\n:confused: Does that mean that no disks are seen from my second node k8snodv02p ?\r\nSorry to ask that question maybe it's very simple but how can I ensure all my 24 disks are seen from openebs perspective ? \r\n\r\n:question: Sai Chaithanya:mayadata_m: \r\nIs ndm daemon pod running on nodev02p node? 
(edited)\r\n\r\n:bulb: Wilf1rst:\r\nThe ndm pod running on each node\r\n```bash\r\n$ kubectl -n openebs get pods -o wide\r\nNAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES\r\nopenebs-admission-server-797586bf87-w9s54 1/1 Running 0 14h 10.42.2.56 k8snodv03p <none> <none>\r\nopenebs-apiserver-5c87d744b8-sff4f 1/1 Running 0 14h 10.42.2.57 k8snodv03p <none> <none>\r\nopenebs-localpv-provisioner-6f74f55454-rwpkm 1/1 Running 1 14h 10.42.3.39 k8snodv01p <none> <none>\r\nopenebs-ndm-2qkj6 1/1 Running 0 14h 172.16.0.7 k8snodv01p <none> <none>\r\nopenebs-ndm-fg7z5 1/1 Running 0 14h 172.16.0.8 k8snodv02p <none> <none>\r\nopenebs-ndm-fhflg 1/1 Running 0 14h 172.16.0.9 k8snodv03p <none> <none>\r\nopenebs-ndm-operator-55cc69dc96-mdplv 1/1 Running 0 14h 10.42.2.58 k8snodv03p <none> <none>\r\nopenebs-provisioner-78b964ffb9-kk46k 1/1 Running 1 14h 10.42.3.40 k8snodv01p <none> <none>\r\nopenebs-snapshot-operator-56c69dcb58-sw4vj 2/2 Running 1 14h 10.42.1.47 k8snodv02p <none> <none>\r\n```\r\n\r\n:spiral_notepad: Also there's a view of my disk from my hosts perspective (I just check and it's the same on my 3 k8s hosts):\r\n```bash\r\n$ sudo fdisk -l|grep \"^Disk\"\r\nDisk /dev/sda: 107.4 GB, 107374182400 bytes, 209715200 sectors\r\nDisk label type: dos\r\nDisk identifier: 0x000b956b\r\nDisk /dev/sdb: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdc: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdd: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sde: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdf: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdg: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdh: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\nDisk /dev/sdi: 34.4 GB, 34359738368 bytes, 67108864 sectors\r\n```\r\n\r\nWilf1rst 18 hours ago\r\nHere's the last line logs from the pods openebs-ndm-fg7z5 on k8snov02p:\r\n```\r\n:\"QEMU_HARDDISK\", Serial:\"drive-scsi8\", Vendor:\"QEMU\", Path:\"/dev/sdi\", ByIdDevLinks:[]string{\"/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi8\"}, ByPathDevLinks:[]string{\"/dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:8\"}, FirmwareRevision:\"\", LogicalSectorSize:0x200, PhysicalSectorSize:0x0, Compliance:\"\", DeviceType:\"\", PartitionType:\"\", FileSystemInfo:controller.FSInfo{FileSystem:\"None\", MountPoint:\"\"}}\r\nI0810 10:44:20.679292 6 blockdevicestore.go:98] Updated blockdevice object : blockdevice-db1254ebd777a99e6b9b5626358c7038\r\n```\r\nIt seems that a disk is detected.\r\n\r\n:question: Sai Chaithanya:mayadata_m:\r\nCan you pasted the output of kubectl get bd -n openebs\r\n\r\n:bulb: Wilf1rst \r\nHere it is:\r\n```bash\r\n$ kubectl get bd -n openebs\r\nNAME SIZE CLAIMSTATE STATUS AGE\r\nblockdevice-10cbf5d6bd8330920fcb3bdda1aa6a55 34359738368 Unclaimed Active 62m\r\nblockdevice-11ebb418e80395b0378eca084e2a17e4 34359738368 Unclaimed Active 62m\r\nblockdevice-169068a2a396a22759d5bfa04f4a1b04 34359738368 Unclaimed Active 45m\r\nblockdevice-51e5c068c21f5186044957960dfbaefb 34359738368 Unclaimed Active 63m\r\nblockdevice-9d0e2efbf255eec778874a9fd42f1a72 34359738368 Unclaimed Active 58m\r\nblockdevice-c2b57cce00020b1d1562b9c469a949fa 34359738368 Unclaimed Active 58m\r\nblockdevice-dae3cec149cfd5fc5c6826caa0650e21 34359738368 Unclaimed Active 58m\r\nblockdevice-db1254ebd777a99e6b9b5626358c7038 34359738368 Unclaimed Active 44m\r\nsparse-74a30cf9fb7f5f8dc8be1f3c5cb5bed3 10737418240 Unclaimed Active 14h\r\nsparse-a29c08ed690c05073f0e149247a620d9 10737418240 Unclaimed Active 
14h\r\nsparse-ddea1324282dfec8b75645302e8b54b7 10737418240 Unclaimed Active 14h\r\n```\r\n\r\n:spiral_notepad: Regarding what I said previously that a disk is detected on my k8snodv02p ndm pod, it seems that the blockdevice is from my k8snodv01p:\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-db1254ebd777a99e6b9b5626358c7038\r\nName: blockdevice-db1254ebd777a99e6b9b5626358c7038\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:43:45Z\r\n Generation: 1\r\n Resource Version: 2261226\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-db1254ebd777a99e6b9b5626358c7038\r\n UID: ba9b82df-bb5b-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi8\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi8\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:8\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdi\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:question: ranjithr005:mayadata_m: \r\nCan we ensure the serial number of these disks are different. Using kubctl describe bd <bd-name> -n openebs, you will get the serial number of the disks.\r\n\r\n:bulb: Wilf1rst\r\nblockdevice-10cbf5d6bd8330920fcb3bdda1aa6a55\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-10cbf5d6bd8330920fcb3bdda1aa6a55\r\nName: blockdevice-10cbf5d6bd8330920fcb3bdda1aa6a55\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:25:33Z\r\n Generation: 1\r\n Resource Version: 2257365\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-10cbf5d6bd8330920fcb3bdda1aa6a55\r\n UID: 2f820cd9-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi2\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi2\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:2\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdc\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-11ebb418e80395b0378eca084e2a17e4\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-11ebb418e80395b0378eca084e2a17e4\r\nName: blockdevice-11ebb418e80395b0378eca084e2a17e4\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:26:00Z\r\n Generation: 1\r\n Resource Version: 2257466\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-11ebb418e80395b0378eca084e2a17e4\r\n UID: 
3f6d5cbb-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi3\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi3\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:3\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdd\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-169068a2a396a22759d5bfa04f4a1b04\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-169068a2a396a22759d5bfa04f4a1b04\r\nName: blockdevice-169068a2a396a22759d5bfa04f4a1b04\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:42:16Z\r\n Generation: 1\r\n Resource Version: 2260911\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-169068a2a396a22759d5bfa04f4a1b04\r\n UID: 854381b1-bb5b-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi7\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi7\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:7\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdh\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-51e5c068c21f5186044957960dfbaefb\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-51e5c068c21f5186044957960dfbaefb\r\nName: blockdevice-51e5c068c21f5186044957960dfbaefb\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv03p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:24:59Z\r\n Generation: 1\r\n Resource Version: 2257294\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-51e5c068c21f5186044957960dfbaefb\r\n UID: 1b720aa1-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi1\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi1\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:1\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdb\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-9d0e2efbf255eec778874a9fd42f1a72\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-9d0e2efbf255eec778874a9fd42f1a72\r\nName: blockdevice-9d0e2efbf255eec778874a9fd42f1a72\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation 
Timestamp: 2019-08-10T10:29:18Z\r\n Generation: 1\r\n Resource Version: 2258344\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-9d0e2efbf255eec778874a9fd42f1a72\r\n UID: b5815988-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi4\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi4\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:4\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sde\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-c2b57cce00020b1d1562b9c469a949fa\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-c2b57cce00020b1d1562b9c469a949fa\r\nName: blockdevice-c2b57cce00020b1d1562b9c469a949fa\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:29:40Z\r\n Generation: 1\r\n Resource Version: 2258429\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-c2b57cce00020b1d1562b9c469a949fa\r\n UID: c2941896-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi5\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi5\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:5\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdf\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-dae3cec149cfd5fc5c6826caa0650e21\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-dae3cec149cfd5fc5c6826caa0650e21\r\nName: blockdevice-dae3cec149cfd5fc5c6826caa0650e21\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:30:04Z\r\n Generation: 1\r\n Resource Version: 2258639\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-dae3cec149cfd5fc5c6826caa0650e21\r\n UID: d14cde9e-bb59-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi6\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi6\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:6\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdg\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: blockdevice-db1254ebd777a99e6b9b5626358c7038\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-db1254ebd777a99e6b9b5626358c7038\r\nName: blockdevice-db1254ebd777a99e6b9b5626358c7038\r\nNamespace: 
openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T10:43:45Z\r\n Generation: 1\r\n Resource Version: 2261226\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-db1254ebd777a99e6b9b5626358c7038\r\n UID: ba9b82df-bb5b-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: QEMU_HARDDISK\r\n Serial: drive-scsi8\r\n Vendor: QEMU\r\n Devlinks:\r\n Kind: by-id\r\n Links:\r\n /dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi8\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:05.0-scsi-0:0:0:8\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/sdi\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\nWilf1rst:\r\n:spiral_notepad: Here's all my bd description :slightly_smiling_face:\r\n\r\n:nerd_face: vitta-at-mayadata:mayadata_m: \r\nNDM detects that disks are different by looking at Model, Serial and vendor..\r\nin this case, if I'm not wrong, one of the disk in each node is having serial as 'drive-scsi*'\r\n\r\n:bulb: vitta-at-mayadata:mayadata_m: \r\ncan we make sure that disks in entire cluster (across nodes) are having different serial numbers?\r\n\r\nWilf1rst\r\n:thinking: Indeed I only see one drive-scsi serial between 1 and 8:\r\n```yaml\r\n$ kubectl -n openebs describe bd |grep -A30 \"Name: blockdevice-\"|grep -i serial|sort\r\n Serial: drive-scsi1\r\n Serial: drive-scsi2\r\n Serial: drive-scsi3\r\n Serial: drive-scsi4\r\n Serial: drive-scsi5\r\n Serial: drive-scsi6\r\n Serial: drive-scsi7\r\n Serial: drive-scsi8\r\n```\r\n\r\n:spiral_notepad: As my virtualisation host is a **Proxmox server**, I assume on each VMs there's a mounted disk with serial drive-scsi1, drive-scsi2 and so on ...So as it seams that the serial is binded on the scsi bus / device I've to set another scsi device if I understand well:\r\nLike\r\n```\r\n* 1 to 8 for my node1\r\n* 9 to 16 for my node2\r\n* 17 to 24 form my node3 (edited)\r\n```\r\n\r\nWilf1rst \r\n:bulb: Thanks i better understand how openebs detect bd\r\n\r\n:nerd_face: vitta-at-mayadata:mayadata_m: \r\nperfect.. if there is another possibility of enabling WWN on virtual machines (similar to vsphere), it would be better\r\n\r\n:question: vitta-at-mayadata:mayadata_m: \r\nBTW, whats the virtualization platform you are using?\r\n\r\nWilf1rst \r\n:spiral_notepad: Here i use [proxmox](https://www.proxmox.com/en/).\r\n```\r\nproxmox.comproxmox.com\r\nProxmox - powerful open-source server solutions\r\nProxmox offers the server virtualization management platform Proxmox VE, and the Proxmox Mail Gateway an antispam and antivirus solution for mail server protection.\r\n```\r\n\r\n:spiral_notepad: vitta-at-mayadata:mayadata_m:\r\nthanks @Wilf1rst.. will go thru that to see options for enabling WWNs.. cc: @akhilerm \r\n\r\n:spiral_notepad: Wilf1rst\r\nI use to works with vSphere at work but hereI use proxmox for it's LXC support and i don't see a simple option to set my WWN in my disk creation. I will check on proxmox doc\r\n\r\nvitta-at-mayadata:mayadata_m: \r\n:+1: ok.. 
if we make sure different serials for all disks, it works\r\n\r\nWilf1rst\r\n:+1: To add to the information I just found that I can add VirtIO Block disk with the same device number in the proxmox console and the two are seen correctly by openebs.\r\n\r\nWilf1rst\r\nBy the way there's no serial set:\r\n```yaml\r\n$ kubectl -n openebs describe bd blockdevice-e3860833c7243177f4a60fe0b9a01aac\r\nName: blockdevice-e3860833c7243177f4a60fe0b9a01aac\r\nNamespace: openebs\r\nLabels: kubernetes.io/hostname=k8snodv01p\r\n ndm.io/blockdevice-type=blockdevice\r\n ndm.io/managed=true\r\nAnnotations: <none>\r\nAPI Version: openebs.io/v1alpha1\r\nKind: BlockDevice\r\nMetadata:\r\n Creation Timestamp: 2019-08-10T12:22:34Z\r\n Generation: 1\r\n Resource Version: 2284595\r\n Self Link: /apis/openebs.io/v1alpha1/namespaces/openebs/blockdevices/blockdevice-e3860833c7243177f4a60fe0b9a01aac\r\n UID: 882f3157-bb69-11e9-8221-0200005283a7\r\nSpec:\r\n Capacity:\r\n Logical Sector Size: 512\r\n Physical Sector Size: 0\r\n Storage: 34359738368\r\n Details:\r\n Compliance: \r\n Device Type: \r\n Firmware Revision: \r\n Model: \r\n Serial: \r\n Vendor: \r\n Devlinks:\r\n Kind: by-path\r\n Links:\r\n /dev/disk/by-path/pci-0000:00:0b.0\r\n /dev/disk/by-path/virtio-pci-0000:00:0b.0\r\n Filesystem:\r\n Partitioned: No\r\n Path: /dev/vda\r\nStatus:\r\n Claim State: Unclaimed\r\n State: Active\r\nEvents: <none>\r\n```\r\n\r\n:spiral_notepad: vitta-at-mayadata:mayadata_m:\r\nchecking with team on the steps of identifying the disks by NDM\r\n\r\nWilf1rst \r\n:heart: Thanks a lot. Your product rocks and you the dev are very helpful. Thank you again\r\n\r\n:spiral_notepad: vitta-at-mayadata:mayadata_m: \r\nthis might help - https://docs.openebs.io/docs/next/ndm.html#ndm-daemonset-functions\r\nhowever, I see that this can be improved..","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Complete documentation of command line flags","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# torch7 move to ATen?\n\nWill this repository be moving from torch7 to ATen?\r\n\r\nIt seems torch7 is deprecated and ATen is recommended:\r\n\"Torch is not in active development\"\r\nhttps://github.com/torch/torch7/blob/master/README.md\r\n\r\nI have real and significant issue with syntax, \r\nand learning new languages is problematic.\r\nHave Jetson Nano and scoping possible issues around implementing this repository\r\napologies if I am off-piste.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# firebase.auth().currentUser returns null\n\n## Issue\r\n\r\nI'm having a hard time getting this to work. I have an app build with react-native and I am using a Firebase database. I use react-native-firebase to try to retrieve the current user. The user can sign up/sign in and everything works as expected so there is no problem here (this does not use react-native-firebase, if needed I can post my auth script here)\r\nI have a \"add place\" page, on this page users upload a photo with some data to the firebase database. On another page users can view this place with the photo and extra data. I would now like to store the Firebase user UID with the photo in the database as well. The idea is to check wether the current signed in user is equal to the user UID stored with the image, if so a delete button is shown so only the user that uploaded the picture can remove it.\r\n\r\nOn my authentication screen I have added the following;\r\n\r\n` \r\nimport firebase from 'react-native-firebase';\r\n\r\n...\r\n\r\ncomponentDidMount() {\r\n console.log(\"ComponentdidMount() Called.\"); \r\n this.FirebaseID = firebase.auth().currentUser\r\n console.log(\"FirebaseID =\", this.FirebaseID); \r\n console.log(\"current user =\", firebase.auth().currentUser);\r\n console.log(\"Firebase all =\", firebase.auth()); \r\n this.props.onAutoSignin();\r\n }`\r\n\r\nI have added the above code also to my add place page to check whatever information would be shown in the React Native Debugger logs. Only with the Firebase all rule I can view information about the logged in user, however user and current user remain null. \r\n\r\n---\r\n\r\n## Project Files\r\n\r\n<!-- Provide the contents of key project files which will help to debug -->\r\n<!-- For Example: -->\r\n<!-- - iOS: `Podfile` contents. -->\r\n<!-- - Android: `android/build.gradle` contents. -->\r\n<!-- - Android: `android/app/build.gradle` contents. -->\r\n<!-- - Android: `AndroidManifest.xml` contents. 
-->\r\n\r\n<!-- ADD THE CONTENTS OF THE FILES IN THE PROVIDED CODE BLOCKS BELOW -->\r\n\r\n### Android\r\n\r\n<details><summary>Click To Expand</summary>\r\n<p>\r\n\r\n#### Have you converted to AndroidX?\r\n\r\nNo I have not\r\n\r\n#### `android/build.gradle`:\r\n\r\n```groovy\r\n dependencies {\r\n ......\r\n classpath 'com.google.gms:google-services:3.2.1'\r\n```\r\n\r\n#### `android/app/build.gradle`:\r\n\r\n```groovy\r\ndependencies {\r\n implementation project(':react-native-firebase')\r\n\r\n ...\r\n\r\n // Firebase dependencies\r\n implementation \"com.google.android.gms:play-services-base:16.1.0\"\r\n implementation \"com.google.firebase:firebase-core:16.0.9\"\r\n implementation \"com.google.firebase:firebase-auth:16.0.2\"\r\n```\r\n\r\n#### `android/settings.gradle`:\r\n\r\n```groovy\r\n\r\n...\r\n\r\ninclude ':react-native-firebase'\r\nproject(':react-native-firebase').projectDir = new File(rootProject.projectDir, '../node_modules/react-native-firebase/android')\r\n\r\n...\r\n\r\n```\r\n\r\n#### `MainApplication.java`:\r\n\r\n```java\r\n\r\n...\r\n\r\nimport io.invertase.firebase.RNFirebasePackage;\r\nimport io.invertase.firebase.auth.RNFirebaseAuthPackage;\r\n\r\n...\r\n\r\n protected List<ReactPackage> getPackages() {\r\n // Add additional packages you require here\r\n // No need to add RnnPackage and MainReactPackage\r\n return Arrays.<ReactPackage>asList(\r\n...\r\n new RNFirebasePackage(),\r\n \r\n...\r\n new RNFirebaseAuthPackage()\r\n\r\n\r\n```\r\n\r\n#### `AndroidManifest.xml`:\r\n\r\n```xml\r\nNo reference to RNFirebase here\r\n```\r\n\r\n</p>\r\n</details>\r\n\r\n\r\n---\r\n\r\n## Environment\r\n\r\n<details><summary>Click To Expand</summary>\r\n<p>\r\n\r\n**`react-native info` output:**\r\n\r\n<!-- Please run `react-native info` on your terminal and paste the contents into the code block below -->\r\n\r\n```\r\n React Native Environment Info:\r\n System:\r\n OS: macOS 10.14.5\r\n CPU: (4) x64 Intel(R) Core(TM) i5-4570 CPU @ 3.20GHz\r\n Memory: 618.70 MB / 16.00 GB\r\n Shell: 3.2.57 - /bin/bash\r\n Binaries:\r\n Node: 10.15.3 - /usr/local/bin/node\r\n Yarn: 1.16.0 - ~/.yarn/bin/yarn\r\n npm: 6.4.1 - /usr/local/bin/npm\r\n SDKs:\r\n iOS SDK:\r\n Platforms: iOS 12.4, macOS 10.14, tvOS 12.4, watchOS 5.3\r\n Android SDK:\r\n API Levels: 22, 23, 24, 25, 26, 27, 28\r\n Build Tools: 25.0.1, 26.0.1, 26.0.2, 27.0.3, 28.0.3, 29.0.1\r\n System Images: android-24 | Google Play Intel x86 Atom, android-25 | Google Play Intel x86 Atom, android-27 | Google APIs Intel x86 Atom, android-27 | Google Play Intel x86 Atom, android-28 | Intel x86 Atom, android-28 | Google APIs Intel x86 Atom\r\n IDEs:\r\n Android Studio: 3.4 AI-183.6156.11.34.5692245\r\n Xcode: 10.3/10G8 - /usr/bin/xcodebuild\r\n npmPackages:\r\n react: 16.6.3 => 16.6.3 \r\n react-native: 0.58.6 => 0.58.6 \r\n npmGlobalPackages:\r\n create-react-native-app: 2.0.2\r\n react-native-cli: 2.0.1```\r\n\r\n<!-- change `[ ]` to `[x]` to select an option(s) -->\r\n\r\n- **Platform that you're experiencing the issue on**:\r\n - [ ] iOS\r\n - [ ] Android\r\n - [ ] **iOS** but have not tested behavior on Android\r\n - [X] **Android** but have not tested behavior on iOS\r\n - [ ] Both\r\n- **`react-native-firebase` version you're using that has this issue:**\r\nSorry I can't find this, how can I check easy? \r\n- **`Firebase` module(s) you're using that has the issue:**\r\nauth? 
\r\n- **Are you using `TypeScript`?**\r\n - `N`\r\n \r\n</p>\r\n</details>\r\n\r\nSo in short, I want to find out the current user and upload this as extra information to the Firebase database to user on a different view but I am unable to get the correct information with firebase.auth().currentUser. \r\n\r\n<!-- Thanks for reading this far down \u2764\ufe0f -->\r\n<!-- High quality, detailed issues are much easier to triage for maintainers -->\r\n\r\n<!-- For bonus points, if you put a \ud83d\udd25 (:fire:) emojii at the start of the issue title we'll know -->\r\n<!-- that you took the time to fill this out correctly, or, at least read this far -->\r\n\r\n---\r\n\r\nThink `react-native-firebase` is great? Please consider supporting all of the project maintainers and contributors by donating via our [Open Collective](https://opencollective.com/react-native-firebase/donate) where all contributors can submit expenses. [[Learn More]](https://invertase.io/oss/react-native-firebase/contributing/donations-expenses)\r\n\r\n- \ud83d\udc49 Check out [`React Native Firebase`](https://twitter.com/rnfirebase) and [`Invertase`](https://twitter.com/invertaseio) on Twitter for updates on the library.\r\n\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"more module webpage","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Proposal: deprecate SQTagUtil.java","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Reactive forms valueChanges method fires twice for one change on input fields.","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Margin on mobile\n\n@johno As you may know, we're using this theme for our documentation at Learn Anything (here is our [repo](https://github.com/learn-anything/docs)) ... and we've noticed that there isn't any margins for the mobile view. Here is a screenshot:\r\n\r\n<img src=\"https://user-images.githubusercontent.com/30328854/62826347-ce85ee00-bbb1-11e9-918a-7c2c52816821.jpg\" width=\"200\" />\r\n\r\nWould love a fix for this \ud83d\ude4f","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"# Why does TypeDoc think all modules are missing? (Error: Could not find a declaration file for module)\n\nI'm not having luck with generating TypeScript docs. I've been trying various tools, including TypeDoc, but no luck so far.\r\n\r\n- [x] I have checked [issues with bug label](https://github.com/TypeStrong/typedoc/labels/bug) and found no duplicates\r\n\r\n## Expected Behavior\r\n\r\nI have a project that compiles fine with `tsc`, `gulp-typescript` and `webpack`.\r\n\r\nI am hoping to be able to run something like\r\n\r\n```\r\ntypedoc --out ./typedocs --mode modules --excludeExternals --excludeNotExported --excludePrivate --excludeProtected ../infamous+infamous/src/index.ts\r\n```\r\n\r\nand get generated docs.\r\n\r\n## Actual Behavior\r\n<!--\r\n What does Typedoc fail to do? \r\n-->\r\n\r\nI get a bunch of errors like\r\n\r\n```\r\n...\r\nError: /Users/trusktr/src/infamous+infamous/src/lib/jss/index.ts(2)\r\n Could not find a declaration file for module 'jss-nested'. '/Users/trusktr/src/infamous+infamous/node_modules/jss-nested/lib/index.js' implicitly has an 'any' type.\r\n Try `npm install @types/jss-nested` if it exists or add a new declaration (.d.ts) file containing `declare module 'jss-nested';`\r\nError: /Users/trusktr/src/infamous+infamous/src/lib/jss/index.ts(3)\r\n Could not find a declaration file for module 'jss-extend'. '/Users/trusktr/src/infamous+infamous/node_modules/jss-extend/lib/index.js' implicitly has an 'any' type.\r\n Try `npm install @types/jss-extend` if it exists or add a new declaration (.d.ts) file containing `declare module 'jss-extend';`\r\n...\r\n```\r\n\r\n\r\n## Environment\r\n - Typedoc version: 0.15.0\r\n - Node.js version: 12.7.0\r\n - OS: macOS\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"Describe streaming arguments in docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"tutorials/first-mvc-app/adding-model include","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"Is it possible to show the added section collapsed?","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"[docs] Create \"register your first application\" quick start guide","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"CORS issues","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"v1.1 Project Page and Project Browse Page","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"An error in the docs","cats":{"DOCUMENTATION":1.0,"OTHER":0.0}}
{"text":"HLAx without docker","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}
{"text":"# Please anyone can help me \n\n## I'm submitting a...\r\n<!-- Check one of the following options with \"x\" -->\r\n<pre><code>\r\n[ ] Bug report <!-- Please search GitHub for a similar issue or PR before submitting -->\r\n[ ] Feature request\r\n[ ] Documentation issue or request\r\n</code></pre>\r\n\r\n## In package\r\n<pre><code>\r\n[ ] @ng-toolkit/init\r\n[ ] @ng-toolkit/serverless\r\n[ ] @ng-toolkit/universal\r\n[ ] @ng-toolkit/pwa\r\n[ ] @ng-toolkit/firebug\r\n</code></pre>\r\n\r\n## Current behavior\r\n<!-- Describe how the issue manifests. -->\r\n\r\n\r\n## Expected behavior\r\n<!-- Describe what the desired behavior would be. -->\r\n\r\n\r\n## Minimal reproduction of the problem with instructions\r\n<!--\r\nFor bug reports please provide the *STEPS TO REPRODUCE*\r\n-->\r\n\r\n## Example repository\r\n<!--\r\nPlease provide link to your public, cloneable repository, to give us a chance to reproduce your issue on our end\r\n-->\r\n\r\n## What is the motivation / use case for changing the behavior?\r\n<!-- Describe the motivation or the concrete use case. -->\r\n\r\n\r\n## Environment\r\n\r\n<pre><code>\r\nAngular version: X.Y.Z\r\n<!-- Check whether this is still an issue in the most recent Angular version -->\r\n\r\n- Node version: XX <!-- run `node --version` -->\r\n- Platform: <!-- Mac, Linux, Windows -->\r\n\r\nOthers:\r\n<!-- Anything else relevant? Operating system version, IDE, package manager, HTTP server, ... -->\r\n</code></pre>\r\n","cats":{"DOCUMENTATION":0.0,"OTHER":1.0}}